The present invention relates to a motion picture editing apparatus for and method of editing a motion picture, such as video, recorded by a video camera or the like, as well as a computer program which makes a computer function as such a motion picture editing apparatus.
Editing of video filmed or recorded by a video camera is widely performed not only by experts but also by the general public. The video editing is generally performed after the recording or filming, by using dedicated editing equipment or a personal computer.
In the video editing, such operations are performed that the recorded video is reproduced and that the video is cut at the start point and end point of an unnecessary scene (or of a scene the user desires to save) while the user confirms the content of the video, in order to delete the unnecessary scene. In such operations, the user needs to confirm almost all of the video content. Thus, particularly if the video is long, such as several tens of minutes or an hour or more, the time and effort required for the operations increase, which is problematic. This almost or completely discourages the user from performing the video editing, and the user tends to save the recorded video as it is on a recording medium, such as a digital video tape, a DVD, or a hard disk. The as-recorded video tends to include unnecessary scenes, such as a scene whose recording failed and a scene recorded needlessly. Thus, the fact that the as-recorded video is saved on the recording medium also leads to such problems that the recording medium is wasted and that the recording medium is not reused.
Therefore, for example, Patent Document 1 discloses a technology of automatically deleting an unnecessary scene, such as an unsightly scene caused by an operation error, camera shake, or the like.
In the technology disclosed in Patent Document 1, however, since the unnecessary scene is deleted automatically, if the editing result differs from the user's intention, such as a case where an unnecessary scene remains in the editing result or a case where an important scene is deleted, the user has to start the editing over. In addition, the operation required for the re-editing is almost the same as the operation required for the first editing, which is problematic. Thus, even if such an operation is repeated, there is a possibility that the user cannot obtain the desired editing result. Even if the user can obtain the desired editing result, the time and effort required for the operations repeated until the desired editing result is obtained likely increase.
In view of the aforementioned problems, it is therefore an object of the present invention to provide, for example, a motion picture editing apparatus and method which enable the user to easily perform the editing operation and the confirmation operation, as well as a computer program which makes a computer function as such a motion picture editing apparatus.
The above object of the present invention can be achieved by a motion picture editing apparatus provided with: a motion picture analyzing device for analyzing a motion picture, thereby obtaining characteristics of the motion picture; a specifying device capable of specifying characteristics of a scene to be identified as an editing target, of the motion picture; an identifying device for identifying a scene with characteristics which match the specified characteristics of the motion picture, as the editing target; and a presenting device for presenting scene information including start and end time points of the identified scene of the motion picture.
According to the motion picture editing apparatus of the present invention, in the editing of the motion picture, firstly, the motion picture is analyzed by the motion picture analyzing device, thereby obtaining the characteristics of the motion picture. Here, the “characteristics of the motion picture” in the present invention means the characteristics of the motion picture caused by the filming or recording, such as a camera shake, a zoom speed, and a panning variation (i.e. a variation in the horizontal direction of the motion picture generated in the filming or recording when the video camera is intentionally swung in the horizontal direction), included in the motion picture filmed or recorded by a video camera or the like. More specifically, the motion picture analyzing device analyzes the characteristics of each of a plurality of frames which constitute the motion picture, such as chromatic characteristics, luminance characteristics, motion characteristics, and spatial frequency characteristics. Incidentally, the obtained characteristics of the motion picture are recorded in a memory device owned by the motion picture analyzing device or provided externally.
Then, if the characteristics of the scene to be identified as the editing target are specified by the specifying device, the scene with the characteristics which match the specified characteristics is identified by the identifying device. In other words, for example, if the user specifies, by using the specifying device, that the “camera shake” is greater than or equal to a predetermined threshold value as the characteristics of the scene to be identified, the identifying device identifies a scene in which the “camera shake” is greater than or equal to the predetermined threshold value, from a plurality of scenes included in the motion picture, on the basis of the characteristics of the motion picture obtained by the motion picture analyzing device. Thus, the user can intuitively or collectively identify the editing target by specifying, by using the specifying device, the characteristics of the scene the user desires to identify as the editing target.
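As a rough illustration of this identification step, the following Python sketch groups consecutive frames whose per-frame characteristic amount, here a hypothetical camera-shake measure, is greater than or equal to a specified threshold value into scenes having start and end time points. The function and variable names are assumptions made for illustration and do not appear in the embodiment described later.

```python
# A minimal sketch, assuming the analysis stage has already produced one
# numeric camera-shake value per frame; all names here are hypothetical.

def identify_scenes(frame_shake, threshold, fps=30.0):
    """Return (start_time, end_time) pairs for runs of consecutive frames
    whose camera-shake measure is greater than or equal to the threshold."""
    scenes = []
    start = None
    for i, shake in enumerate(frame_shake):
        if shake >= threshold and start is None:
            start = i                                  # a matching run begins
        elif shake < threshold and start is not None:
            scenes.append((start / fps, i / fps))      # the run covered frames start..i-1
            start = None
    if start is not None:                              # the run reaches the last frame
        scenes.append((start / fps, len(frame_shake) / fps))
    return scenes

# Frames 2-4 exceed the threshold, so a single scene is reported.
print(identify_scenes([0.1, 0.2, 0.9, 0.8, 0.7, 0.1], threshold=0.5))
```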
Then, the scene information including the start and end time points of the scene identified by the identifying device is presented by the presenting device. Here, the “scene information” of the present invention means information for identifying the scene, which includes the start time point at which the scene starts and the end time point at which the scene ends. More specifically, the presenting device presents the scene information by graphically displaying the start and end time points of the identified scene on the screen of the displaying device. Thus, the user can easily perform an operation of confirming the scene identified as the editing target by looking at the presented scene information.
As explained above, according to the motion picture editing apparatus of the present invention, the user can intuitively or collectively identify the editing target by specifying the characteristics of the scene the user desires to identify as the editing target by using the specifying device. Moreover, the user can easily perform the operation of confirming the scene identified as the editing target by looking at the presented scene information. As a result, it is possible to reduce the time and energy or effort required for the user's confirmation operation; namely, the user can easily perform the confirmation operation.
In one aspect of the motion picture editing apparatus of the present invention, the motion picture analyzing device is provided with at least one of characteristic analyzing devices which are: a chromatic characteristic analyzing device for analyzing chromatic characteristics in each of a plurality of frames which constitute the motion picture; a luminance characteristic analyzing device for analyzing luminance characteristics in each of the plurality of frames which constitute the motion picture; a motion characteristic analyzing device for analyzing motion characteristics in each of the plurality of frames which constitute the motion picture; and a spatial frequency characteristic analyzing device for analyzing spatial frequency characteristics in each of the plurality of frames which constitute the motion picture.
According to this aspect, the chromatic characteristic analyzing device analyzes chromatic characteristics in each frame (e.g. a dominant color, a color ratio, or the like in each frame). The luminance characteristic analyzing device analyzes luminance characteristics in each frame (e.g. average brightness, maximum brightness, minimum brightness, or the like in each frame). The motion characteristic analyzing device analyzes motion characteristics in each frame (e.g. the distribution of overall or local motion vectors between the frame and the temporally adjacent frames). The spatial frequency characteristic analyzing device analyzes spatial frequency characteristics in each frame (e.g. the distribution of frequency components in each frame obtained by FFT (Fast Fourier Transform) or DCT (Discrete Cosine Transform)). The motion picture analyzing device has at least one of the characteristic analyzing devices which are the chromatic characteristic analyzing device, the luminance characteristic analyzing device, the motion characteristic analyzing device, and the spatial frequency characteristic analyzing device, so that it can certainly obtain the characteristics of the motion picture.
In another aspect of the motion picture editing apparatus of the present invention, the specifying device can specify a type and level of the characteristics of the scene to be identified, and the identifying device judges whether or not the characteristics of the specified type match the specified characteristics on the basis of the specified level.
According to this aspect, the user can identify the scene to be identified as the editing target by specifying the type of the characteristics (e.g. a camera shake, zoom speed, panning variation, or the like) and its level (i.e. the magnitude of a characteristic amount indicating the extent of the characteristics, such as large, medium, and small).
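The specified level may be converted into a numeric threshold value for this judgment, for example as in the following sketch; the characteristic types, level names, and threshold values in the table are invented purely for illustration and are not taken from the present invention.

```python
# A minimal sketch, assuming each qualitative level is mapped to a numeric
# threshold per characteristic type; the table values are illustrative only.

THRESHOLDS = {
    "camera_shake":      {"small": 0.2, "medium": 0.5, "large": 0.8},
    "zoom_speed":        {"small": 0.1, "medium": 0.4, "large": 0.7},
    "panning_variation": {"small": 0.3, "medium": 0.6, "large": 0.9},
}

def matches(characteristic_type, level, measured_value):
    """Judge whether a measured characteristic amount satisfies the
    specified type and level."""
    return measured_value >= THRESHOLDS[characteristic_type][level]

print(matches("camera_shake", "medium", 0.65))   # True
print(matches("zoom_speed", "large", 0.65))      # False
```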
In another aspect of the motion picture editing apparatus of the present invention, it is further provided with: a history holding device for holding a history of the scene information associated with the identified scene; and a comparing device for comparing at least two pieces of scene information included in the held history, thereby extracting a different portion in which the at least two pieces of scene information are different from each other or a common portion in which the at least two pieces of scene information are common with each other, the presenting device further presenting the different portion or the common portion.
According to this aspect, the history holding device holds the history of the scene information associated with the scene identified by the identifying device. In other words, the history holding device records the scene information including the start and end time points of the identified scene into a memory device owned by the history holding device or provided externally, every time the scene is identified by the identifying device. That is, every time the user specifies the characteristics of the scene to be identified as the editing target by using the specifying device (i.e. every time the user performs one editing operation), the scene which matches the specified characteristics is identified by the identifying device and the scene information associated with the identified scene is held by the history holding device.
The comparing device compares at least two pieces of scene information specified by the user from among a plurality of pieces of scene information included in the held history, thereby extracting the different portion or common portion between the at least two pieces of scene information. In other words, for example, a difference between the scene information associated with the scene identified when the user performs the first editing operation (i.e. when the user firstly specifies the characteristics by using the specifying device) and the scene information associated with the scene identified when the user performs the second editing operation (i.e. when the user secondly specifies the characteristics by using the specifying device) is extracted by the comparing device.
The different portion or common portion extracted in this manner is presented to the user by the presenting device. Thus, the user can recognize the different portion or common portion. Therefore, the user can easily determine the editing operation to be performed next in order to obtain the user's desired editing result, on the basis of the different portion or common portion. As a result, the user can perform the editing operation, more easily.
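One possible way of extracting the common portion and the different portion is sketched below, assuming each piece of scene information is a list of (start, end) time intervals in seconds; the per-second timeline and the function names are simplifications made for illustration only.

```python
# A minimal sketch, assuming each piece of scene information is a list of
# (start_time, end_time) intervals in seconds; intervals are reduced to a
# per-second boolean timeline purely for simplicity.

def to_timeline(scene_info, total_seconds):
    """Mark every whole second covered by any identified scene."""
    covered = [False] * total_seconds
    for start, end in scene_info:
        for t in range(int(start), min(int(end), total_seconds)):
            covered[t] = True
    return covered

def compare(info_a, info_b, total_seconds):
    """Return the common portion and the different portion as lists of seconds."""
    a = to_timeline(info_a, total_seconds)
    b = to_timeline(info_b, total_seconds)
    common = [t for t in range(total_seconds) if a[t] and b[t]]
    different = [t for t in range(total_seconds) if a[t] != b[t]]
    return common, different

# The first editing operation identified 10-20 s, the second identified 15-25 s.
common, different = compare([(10, 20)], [(15, 25)], total_seconds=30)
print(common)      # seconds 15..19 (common portion)
print(different)   # seconds 10..14 and 20..24 (different portion)
```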
In another aspect of the motion picture editing apparatus of the present invention, the presenting device has a reproducing device for reproducing the identified scene.
According to this aspect, the user can confirm the content of the identified scene. Thus, the user can easily judge whether or not the identified scene is the user's desired scene.
In an aspect in which the history holding device and the comparing device are further provided, as described above, the presenting device may have a reproducing device for reproducing a scene corresponding to the different portion or the common portion.
In this case, the user can confirm the content of the different portion or common portion. Thus, the user can easily determine the editing operation to be performed next in order to obtain the user's desired editing result, for example, by confirming only the content of the different portion between the two pieces of scene information. Here, in particular, in comparison with a case where all the content of the editing result (e.g. the motion picture after automatically deleting an unnecessary scene) is confirmed in each editing operation (e.g. every time the unnecessary scene is automatically deleted from the original motion picture), it is possible to reduce a wasteful confirmation operation, and it is possible to certainly reduce the time and energy or effort required for the editing operation until the user obtains the desired editing result.
Incidentally, the reproducing device may selectively reproduce one portion of the scene corresponding to the different portion or common portion in accordance with the user's instruction.
The above object of the present invention can be also achieved by a motion picture editing method provided with: a motion picture analyzing process of analyzing a motion picture, thereby obtaining characteristics of the motion picture; a specifying process of specifying characteristics of a scene to be identified as an editing target, of the motion picture; an identifying process of identifying a scene with characteristics which match the specified characteristics of the motion picture, as the editing target; and a presenting process of presenting scene information including start and end time points of the identified scene of the motion picture.
According to the motion picture editing method of the present invention, it is possible to obtain the same various benefits as those obtained by the aforementioned motion picture editing apparatus of the present invention.
Incidentally, the motion picture editing method of the present invention can also adopt the same various aspects as those of the aforementioned motion picture editing apparatus of the present invention.
The above object of the present invention can be also achieved by a computer program for making a computer function as: a motion picture analyzing device for analyzing a motion picture, thereby obtaining characteristics of the motion picture; a specifying device capable of specifying characteristics of a scene to be identified as an editing target, of the motion picture; an identifying device for identifying a scene with characteristics which match the specified characteristics of the motion picture, as the editing target; and a presenting device for presenting scene information including start and end time points of the identified scene of the motion picture.
According to the computer program of the present invention, the aforementioned motion picture editing apparatus of the present invention can be relatively easily realized as a computer provided in the motion picture editing apparatus reads and executes the computer program from a program storage device, such as a ROM, a CD-ROM, a DVD-ROM, or a hard disk, or executes the computer program after downloading it through a communication device. This enables the user to easily perform the editing operation and the confirmation operation, as in the aforementioned motion picture editing apparatus of the present invention.
Incidentally, the computer program of the present invention can also adopt the same various aspects as those of the aforementioned motion picture editing apparatus of the present invention.
As explained in detail above, the motion picture editing apparatus of the present invention is provided with the motion picture analyzing device, the specifying device, the identifying device, and the presenting device. The motion picture editing method of the present invention is provided with the motion picture analyzing process, the specifying process, the identifying process, and the presenting process. Thus, the user can easily perform the editing operation and the confirmation operation. The computer program of the present invention makes a computer function as the motion picture analyzing device, the specifying device, the identifying device, and the presenting device. Thus, the aforementioned motion picture editing apparatus can be constructed relatively easily.
The operation and other advantages of the present invention will become more apparent from the embodiment explained below.
Hereinafter, an embodiment of the present invention will be explained with reference to the drawings.
A motion picture editing apparatus in a first embodiment will be explained.
Firstly, the structure of the motion picture editing apparatus in the first embodiment will be explained with reference to the drawings.
As shown in the drawings, the motion picture editing apparatus 10 in the first embodiment is provided with a video data storage device 100, a video analysis device 200, an editing control device 300, and an editing result confirmation device 400.
The video data storage device 100 receives data on video recorded by a video camera or the like (hereinafter referred to as “video data”) and accumulates the inputted video data. The video data storage device 100 includes a recording medium, such as a hard disk or a memory.
The video analysis device 200 is one example of the “motion picture analyzing device” of the present invention. The video analysis device 200 has the video data inputted and analyzes the inputted video data, thereby obtaining the characteristics of the video data. More specifically, the video analysis device 200 has a time information extraction device 210, a chromatic characteristic analysis device 220, a luminance characteristic analysis device 230, a motion characteristic analysis device 240, a spatial frequency characteristic analysis device 250, a characteristic data generation device 260, and a characteristic data storage device 270.
The time information extraction device 210 extracts (or separates) time information, such as a frame number and a time code, included in the video data.
The chromatic characteristic analysis device 220 analyzes the characteristics of color (i.e. chromatic characteristics) in each of the plurality of frames which constitute the video associated with the video data. The chromatic characteristic analysis device 220 extracts, for example, a dominant color and a color ratio in each frame, as the chromatic characteristics.
The luminance characteristic analysis device 230 analyzes the characteristics of brightness (i.e. luminance characteristics) in each of the plurality of frames which constitute the video associated with the video data. The luminance characteristic analysis device 230 extracts, for example, average brightness, maximum brightness, and minimum brightness in each frame, as the luminance characteristics.
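A minimal sketch of such per-frame chromatic and luminance analysis is given below, assuming each frame is available as an H x W x 3 RGB array; the coarse color quantization and the function names are illustrative choices and are not those of the embodiment.

```python
# A minimal sketch; the frame is assumed to be an H x W x 3 uint8 RGB array
# (decoded elsewhere), and the extracted quantities follow the description above.

import numpy as np

def chromatic_features(frame, levels=4):
    """Dominant color and its ratio, using a coarse per-channel quantization."""
    q = (frame // (256 // levels)).reshape(-1, 3).astype(np.int64)
    bins = q[:, 0] * levels * levels + q[:, 1] * levels + q[:, 2]
    counts = np.bincount(bins, minlength=levels ** 3)
    dominant_bin = int(counts.argmax())          # index of the dominant quantized color
    ratio = counts[dominant_bin] / bins.size     # its share of all pixels
    return dominant_bin, float(ratio)

def luminance_features(frame):
    """Average, maximum, and minimum brightness (ITU-R BT.601 luma)."""
    y = 0.299 * frame[..., 0] + 0.587 * frame[..., 1] + 0.114 * frame[..., 2]
    return float(y.mean()), float(y.max()), float(y.min())

frame = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)
print(chromatic_features(frame))
print(luminance_features(frame))
```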
The motion characteristic analysis device 240 analyzes the characteristics of motion (i.e. motion characteristics) in each of the plurality of frames which constitute the video associated with the video data. The motion characteristic analysis device 240 extracts, for example, camera work information (i.e. a direction and a speed in which the video camera moves) and motion area information (i.e. the number, positions, and dimensions of areas moving in the video) in each frame, as the motion characteristics, by analyzing the distribution of overall or local motion vectors between the frame and the temporally adjacent frames.
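The camera work information may, for example, be estimated by phase correlation between consecutive grayscale frames, as sketched below; phase correlation is merely one of many possible motion analyses, and the motion area information (local motion) is omitted from this sketch.

```python
# A minimal sketch; both frames are assumed to be 2-D grayscale float arrays
# of the same size, and only the dominant (global) translation is estimated.

import numpy as np

def global_motion(prev_gray, curr_gray):
    """Estimate the dominant translation (dx, dy) from prev_gray to curr_gray
    by phase correlation: the peak of the correlation surface gives the shift."""
    F1 = np.fft.fft2(prev_gray)
    F2 = np.fft.fft2(curr_gray)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-9                 # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    if dy > h // 2:                               # wrap negative shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dx), int(dy)

prev = np.zeros((64, 64))
prev[20:30, 20:30] = 1.0
curr = np.roll(prev, shift=(3, 5), axis=(0, 1))   # the pattern moves 5 px right, 3 px down
print(global_motion(prev, curr))                  # (5, 3): camera-work direction and speed
```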
The spatial frequency characteristic analysis device 250 analyzes the characteristics of a spatial frequency (i.e. spatial frequency characteristics) in each of the plurality of frames which constitute the video associated with the video data. The spatial frequency characteristic analysis device 250 calculates frequency components by FFT, DCT, or the like in each of the domains into which each frame is divided, and it extracts low-frequency domain information (i.e. the number, positions, and dimensions of domains whose frequency components are lower than a predetermined frequency) and high-frequency domain information (i.e. the number, positions, and dimensions of domains whose frequency components are higher than a predetermined frequency), as the spatial frequency characteristics.
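The low-frequency domain information and the high-frequency domain information may be obtained, for example, as sketched below, by dividing a grayscale frame into domains and examining how the FFT energy of each domain is distributed; the block size, cut-off index, and energy ratio are illustrative values only.

```python
# A minimal sketch; the frame is assumed to be a 2-D grayscale float array,
# and the block size, cut-off index, and energy ratio are illustrative values.

import numpy as np

def frequency_domain_info(gray, block=16, low_cut=2, low_ratio=0.5):
    """Divide the frame into block x block domains and classify each domain as
    low- or high-frequency from the energy distribution of its FFT."""
    idx = np.minimum(np.arange(block), block - np.arange(block))
    radius = np.add.outer(idx, idx)                    # crude 2-D frequency index
    low_domains, high_domains = [], []
    for y in range(0, gray.shape[0] - block + 1, block):
        for x in range(0, gray.shape[1] - block + 1, block):
            spec = np.abs(np.fft.fft2(gray[y:y + block, x:x + block])) ** 2
            spec[0, 0] = 0.0                           # ignore the DC component
            total = spec.sum()
            if total < 1e-6 or spec[radius <= low_cut].sum() / total >= low_ratio:
                low_domains.append((x, y))             # flat or mostly low-frequency
            else:
                high_domains.append((x, y))
    return low_domains, high_domains

gray = np.random.rand(64, 64)
gray[:32, :] = 0.5                                     # a flat (low-frequency) upper half
low, high = frequency_domain_info(gray)
print(len(low), len(high))                             # 8 flat domains, 8 noisy domains
```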
The characteristic data generation device 260 generates characteristic data on the basis of the time information extracted by the time information extraction device 210 and each of the analysis results (i.e. each of the extracted characteristics) obtained by the chromatic characteristic analysis device 220, the luminance characteristic analysis device 230, the motion characteristic analysis device 240, and the spatial frequency characteristic analysis device 250.
The characteristic data 50 generated in this manner includes, for each of the plurality of frames, the time information extracted by the time information extraction device 210 and the respective analysis results (i.e. the extracted characteristics); namely, the characteristic data 50 is integrated by the frame unit. The generated characteristic data 50 is stored in the characteristic data storage device 270.
Incidentally, the video analysis device 200 may have the time information inputted thereto separately from the video data. In this case, the video analysis device 200 can be constructed without the time information extraction device 210. Moreover, in addition to the chromatic characteristic analysis device 220, the luminance characteristic analysis device 230, the motion characteristic analysis device 240, and the spatial frequency characteristic analysis device 250, another characteristic analysis device for analyzing another characteristic of the video may be added, so that an analysis result obtained by the other characteristic analysis device may be included in the characteristic data. Moreover, a plurality of pieces of characteristic data may be generated on the basis of the respective analysis results obtained by the chromatic characteristic analysis device 220, the luminance characteristic analysis device 230, the motion characteristic analysis device 240, and the spatial frequency characteristic analysis device 250. In other words, the characteristic data may not be integrated by the frame unit.
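One possible record format in which the characteristic data is integrated by the frame unit is sketched below; the field names are illustrative and are not taken from the characteristic data 50 of the embodiment.

```python
# A minimal sketch; the field names are hypothetical and only illustrate one
# way of bundling the time information with the analysis results per frame.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FrameCharacteristics:
    frame_number: int                      # time information
    time_code: str                         # e.g. "00:00:04:00"
    dominant_color: Tuple[int, int, int]   # chromatic characteristics
    color_ratio: float
    avg_brightness: float                  # luminance characteristics
    max_brightness: float
    min_brightness: float
    camera_motion: Tuple[float, float]     # motion characteristics (dx, dy)
    motion_areas: List[Tuple[int, int, int, int]] = field(default_factory=list)
    low_freq_domains: int = 0              # spatial frequency characteristics
    high_freq_domains: int = 0

record = FrameCharacteristics(
    frame_number=120, time_code="00:00:04:00",
    dominant_color=(32, 64, 128), color_ratio=0.41,
    avg_brightness=96.5, max_brightness=240.0, min_brightness=3.0,
    camera_motion=(5.0, 3.0))
print(record.frame_number, record.dominant_color, record.camera_motion)
```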
The editing control device 300 has a characteristic specification device 310, a scene identification device 320, a history holding device 330, and a history comparison device 340.
The characteristic specification device 310 is constructed such that the user can specify the characteristics of a scene to be identified as an editing target, from the video associated with the video data. The characteristic specification device 310 has a GUI (Graphical User Interface) for the user to input the characteristics of the scene to be identified as the editing target.
As shown in the drawings, the characteristic specification device 310 in the embodiment is realized as a GUI 170 which allows the user to specify, as the characteristics of the scene to be identified as the editing target, characteristics such as a “camera shake”, a “blocked up shadow”, and a “blurred area”.
Incidentally, the “blocked up shadow” is a characteristic used as a criterion for judging whether or not a scene that was backlit in the filming is identified as the editing target. The “blurred area” is a characteristic used as a criterion for judging whether or not a scene that is out of focus in the filming (i.e. a blurred scene) is identified as the editing target.
The GUI 170 is provided with editing buttons 71 and 72. By the editing button 71 being pressed (or selected) by the user, video editing (in the embodiment, the deletion of the scene identified by the scene identification device 320) is performed. By the editing button 72 being pressed by the user, the video editing most recently performed is canceled.
As a modified example, the characteristic specification device 310 may be realized as a GUI 80 which allows the user to specify not only the type but also the level of the characteristics of the scene to be identified as the editing target.
In other words, the characteristic specification device may specify the type and level of the characteristics of the scene to be identified as the editing target by the user inputting, to each of characteristic specification devices C1 to Cn (wherein n is a natural number) of the GUI 80, the level of the characteristic corresponding to that characteristic specification device. By an editing button 81 being pressed by the user, the video editing may be performed, and by an editing button 82 being pressed by the user, the video editing most recently performed may be canceled.
The scene identification device 320 is one example of the “identifying device” of the present invention. The scene identification device 320 identifies a scene with characteristics which match the characteristics specified by the characteristic specification device 310, from the video associated with the video data, on the basis of the characteristic data stored in the characteristic data storage device 270.
Moreover, the scene identification device 320 generates scene information including the start and end time points of the identified scene and outputs it to the history holding device 330 described later.
The history holding device 330 holds a history of the scene information inputted from the scene identification device 320. The history holding device 330 includes a recording medium, such as a hard disk and a memory.
The history 650 of the scene information held by the history holding device 330 accumulates pieces of scene information 600(1), 600(2), . . . in the order in which they are generated by the scene identification device 320, that is, every time the user performs one editing operation.
The history comparison device 340 is one example of the “comparing device” of the present invention. The history comparison device 340 compares at least two pieces of scene information included in the history 650, thereby extracting a different portion in which the two pieces of scene information are different from each other or a common portion in which the two pieces of scene information are common with each other, and it outputs the extracted different portion or common portion to the editing result confirmation device 400.
The editing result confirmation device 400 is one example of the “presenting device” of the present invention. The editing result confirmation device 400 has an editing result display device 410 and a reproduction device 420.
The editing result display device 410 graphically displays the scene information (more specifically, the aforementioned different portion or common portion) inputted from the editing control device 300 (more specifically, the history comparison device 340), as the editing result, on the screen of a display owned by the editing result confirmation device 400 or externally provided.
The reproduction device 420 is adapted to reproduce the scene corresponding to the different portion or common portion of the scene information. In other words, the reproduction device 420 is adapted to read the video data of the scene corresponding to the different portion or common portion of the scene information from the video data storage device 100 and to reproduce it.
As shown in the drawings, the editing result display device 410 displays, on the screen, a video display area 701, a first scene information display device 710, a second scene information display device 720, a scale device 730, a reproduction position display 740, a scene information selection button 780, and a confirmation method selection button 790.
In the video display area 701, the scene reproduced by the reproduction device 420 is displayed.
The first scene information display device 710 displays the position in the entire video of the scene without the characteristics which match the characteristics specified by the user as the editing target (i.e. the scene that is not identified as the editing target), from the video associated with the video data. The first scene information display device 710 is displayed in a rectangular shape as a whole, and an unidentified part display 760 which shows the position of the scene that is not identified is displayed in the rectangle.
The second scene information display device 720 displays the position in the entire video of the scene with the characteristics which match the characteristics specified by the user as the editing target (i.e. the scene that is identified as the editing target), from the video associated with the video data. The second scene information display device 720 is displayed in a rectangular shape as a whole, as in the first scene information display device 710, and an identified part display 750 which shows the position of the identified scene is displayed in the rectangle.
The identified part display 750 is displayed as a different portion display 771 and a common portion display 772.
The different portion display 771 displays the position of the scene corresponding to the different portion in which one scene information and another scene information are different from each other, if the one scene information and the other scene information included in the history 650 of the scene information are selected by the user by using the scene information selection button 780 described later.
The common portion display 772 displays the position of the scene corresponding to the common portion in which one scene information and another scene information are common with each other, if the one scene information and the other scene information included in the history 650 of the scene information are selected by the user by using the scene information selection button 780 described later.
The different portion display 771 and the common portion display 772 are displayed in colors or patterns which are different from each other, and this enables the user to distinguish between the different portion display 771 and the common portion display 772.
More specifically, for example, if the user selects the scene information 600(1) and the scene information 600(2) in the history 650 of the scene information, the different portion display 771 shows the position of the scene identified in only one of the scene information 600(1) and the scene information 600(2), and the common portion display 772 shows the position of the scene identified in both of them.
As described above, the user can distinguish between the different portion display 771 and the common portion display 772, so that the user can easily determine the editing operation to be performed next in order to obtain the user's desired editing result, for example, on the basis of the different portion display 771 or the common portion display 772.
The scale device 730 is displayed in association with one side of each rectangle of the first scene information display device 710 and the second scene information display device 720, and the entire length of the scale device 730 corresponds to the length of the entire video associated with the video data.
The reproduction position display 740 shows the reproduction position of the scene reproduced by the reproduction device 420 (in other words, displayed in the video display area 701). In other words, by the reproduction position display 740 being displaced along the scale device 730 in accordance with the reproduction position of the scene reproduced by the reproduction device 420, the user can recognize the reproduction position.
The scene information selection button 780 is a GUI for the user to select the scene information to be displayed on the first scene information display device 710 and the second scene information display device 720, from the history 650 of the scene information.
The confirmation method selection button 790 is a GUI for the user to select a display method of displaying the scene in the video display area 701, and the user can select whether all the scenes corresponding to the identified part display 750 are reproduced and displayed or only the scene corresponding to the different portion display 771 of the identified part display 750 is reproduced and displayed. In other words, the confirmation method selection button 790 is provided with selection buttons 791 and 792. The display method of reproducing and displaying all the scenes corresponding to the identified part display 750 is selected by pressing the selection button 791, and the display method of reproducing and displaying only the scene corresponding to the different portion display 771 of the identified part display 750 is selected by pressing the selection button 792.
Next, the operations of the motion picture editing apparatus in the first embodiment will be explained with reference to the drawings.
Hereinafter, firstly, an explanation will be given on the basic operations of the motion picture editing apparatus 10 when the user edits the video data. Incidentally, when the user edits the video data, the operations associated with the video data storage device and the video analysis device, described below, are performed first, and then the operations associated with the editing control device and the editing result confirmation device, described below, are performed.
Firstly, the video data recorded by a video camera or the like is inputted to the video data storage device 100, and the video data storage device 100 accumulates the inputted video data (step S11).
Then, the video analysis device 200 analyzes the video data, thereby generating the characteristic data (step S12). In other words, the video analysis device 200 firstly extracts the time information by using the time information extraction device 210 and analyzes the video data by using the chromatic characteristic analysis device 220, the luminance characteristic analysis device 230, the motion characteristic analysis device 240, and the spatial frequency characteristic analysis device 250. Then, the video analysis device 200 generates the characteristic data 50 described above by using the characteristic data generation device 260 and stores it in the characteristic data storage device 270.
Then, when the user edits the accumulated video data, firstly, the characteristic specification device 310 specifies the characteristics of the scene to be identified as the editing target in accordance with the user's operation, thereby generating control data which includes characteristic information indicating the specified characteristics (step S21).
Then, the scene identification device 320 identifies the scene with the characteristics which match the characteristic information included in the control data, thereby generating the scene information (step S22). In other words, the scene identification device 320 searches the characteristic data described above for the frames with the characteristics which match the characteristic information, and it generates the scene information including the start and end time points of the scene constituted by the matching frames.
Then, the history holding device 330 holds the history of the scene information (step S23). In other words, the history holding device 330 holds the history 650 of the scene information described above.
Then, the history comparison device 340 compares the plurality of pieces of scene information, and it extracts the different portion or common portion (step S24). In other words, the history comparison device 340 extracts the different portion in which the two pieces of scene information are different from each other or the common portion in which the two pieces of scene information are common with each other, with regard to the two pieces of scene information specified by the user (e.g. the scene information 600(1) and 600(2)), as described above.
Then, the editing result display device 410 displays the different portion or common portion on the screen (step S25). In other words, the editing result display device 410 displays the unidentified part display 760, which shows the position in the entire video of the scene that is not identified, on the first scene information display device 710, and it also displays the identified part display 750, which shows the position in the entire video of the identified scene, on the second scene information display device 720, as described above.
Then, the reproduction device 420 reads the video data corresponding to the different portion or common portion from the video data storage device 100 and reproduces it (step S26). In other words, the reproduction device 420 reproduces the scene corresponding to the different portion or common portion extracted by the history comparison device 340 (i.e. the scene corresponding to the different portion display 771 or the common portion display 772 described above), and the reproduced scene is displayed in the video display area 701.
After confirming the video displayed in the video display area 701 (i.e. the scene corresponding to the different portion display 771 or the common portion display 772), the user changes the characteristics of the scene to be identified as the editing target if the editing result is not the user's desired result, and the user specifies them again by using the characteristic specification device 310 (the step S21). The series of processes in the step S21 to the step S25 is repeated until the editing result matches the user's desired result.
Next, an explanation will be given on one example of the editing result displayed by the editing result display device when the user edits the video data by using the motion picture editing apparatus in the first embodiment.
For example, when the user performs a first editing operation by specifying the characteristics of the scene to be identified as the editing target, the identified part display 750 and the unidentified part display 760 are displayed on the second scene information display device 720 and the first scene information display device 710, respectively. Then, when the user changes the specified characteristics and performs a second editing operation, the different portion display 771 and the common portion display 772, which are obtained by comparing the scene information associated with the first editing operation with the scene information associated with the second editing operation, are displayed on the second scene information display device 720.
Moreover, the scene corresponding to the different portion display 771 is reproduced by the reproduction device 420 and displayed in the video display area 701. Thus, the user can easily perform the operation of confirming the scene identified as the editing target. In other words, by displaying the scene corresponding to the different portion display 771 in the video display area 701, it is possible to easily confirm the difference (in other words, the changed part) between the editing result by the latest editing operation and the editing result by the editing operation immediately before the latest editing operation; namely, it is possible to reduce the time required for the confirmation of the editing result, in comparison with a case where the user confirms all the edited video at each time of the editing operation.
By repeating such an editing operation while confirming the different portion display 771 and the common portion display 772, the user can bring the editing result close to the user's desired editing result.
The present invention can also be applied to an HDD recorder, a DVD recorder, video editing software, a camcorder with a video editing function, or the like, in addition to the motion picture editing apparatus explained in the aforementioned embodiment.
The present invention is not limited to the aforementioned examples, but various changes may be made, if desired, without departing from the essence or spirit of the invention which can be read from the claims and the entire specification. A motion picture editing apparatus and a motion picture editing method which involve such changes are also intended to be within the technical scope of the present invention.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---
PCT/JP2007/065542 | 8/8/2007 | WO | 00 | 5/24/2010 |