The present application claims priority from Japanese application JP2005-120484 filed on Apr. 19, 2005, the content of which is hereby incorporated by reference into this application.
The present invention relates to an apparatus for processing moving pictures to reproduce video data.
Recent advances in digital television broadcast technologies have brought rapid growth in multi-channel broadcasting of video data and have also widened the frequency bands of networks. This in turn has enabled acquisition and audio-visual enjoyment of a great amount of video data. In addition, owing to improvements in video compression/decompression techniques and price reductions of the hardware and software for achieving them, along with increases in the capacity of storage media and decreases in the costs thereof, it has become possible to readily save an increased amount of video data, which leads to a likewise increase in watchable video data. However, busy persons usually have little or no time to watch every part of the video data, resulting in an overflow of watchable video data in some circumstances. Consequently, it becomes important to provide a technique for allowing a user to selectively watch and listen to only his or her preferred or “important” scenes in the video data, thereby enabling a scheme for understanding the contents of interest within a short period of time and a system for permitting the user to quickly search for the specific part of the video data that s/he truly wants to watch.
In light of the technical background, an exemplary approach to enabling on-screen visualization of only important or highlight scenes in video data is disclosed in JP-A-2003-153139. Another selective scene display technique is found in D. DeMenthon, V. Kobla, and D. Doermann, “Video Summarization by Curve Simplification”, ACM Multimedia 98, Bristol, England, (pp. 211-218, 1998).
In particular, the DeMenthon et al. article discloses therein a technique for extracting features from video data and for detecting and ranking highlight scenes based on those features, to thereby reproduce highlight scenes only, at a user-assigned scene-skip rate.
Although several techniques for allowing the user to grasp the contents of video data in a short time period have been proposed, the proposed techniques fail to provide user interfaces preferable to end users. For example, with JP-A-2003-153139 it is possible to watch every scene that appears to be important. Unfortunately, it suffers from a lack of ability to watch important video data parts, partially or entirely, within a time period convenient to the user, because it is impossible to assign a playback time or playback percentage. Regarding the technique taught in the DeMenthon document, although an ability is provided to play back only important scenes at a ratio manually assigned by the user, it is difficult or almost impossible for the user to figure out exactly how to determine an appropriate scene-skip ratio in order to achieve effective viewing of highlight scenes only.
This invention was made to avoid the problems in the prior art, and it is an object of the invention to provide a video processing apparatus capable of permitting users to effectively grasp the contents of video data.
To attain the foregoing object, a video processing apparatus in accordance with one aspect of the invention is arranged to include a video data input unit for inputting video data, a highlight scene data input/generation unit for inputting or generating highlight scene data with a description of an important scene or scenes in the video data, a default playback parameter determination unit for determining a default playback parameter based on the highlight scene data entered or generated by the highlight scene data input/generation unit, a playback parameter input unit for input of a parameter for determination of a playback scene(s), and a control device which provides control in such a way as to preferentially use, when the playback parameter is input by the playback parameter input unit, the playback parameter as input by the playback parameter input unit rather than the playback parameter determined by the default playback parameter determination unit to reproduce the playback scene(s) of the video data.
According to the invention, it becomes possible to effectively grasp the contents of the video data, thereby improving usability for end users.
Other objects, features and advantages of the invention will become apparent from the following description of the embodiments of the invention taken in conjunction with the accompanying drawings.
As shown in
The video data input device 100 inputs video or video data. This input device 100 may typically be comprised of a device which reads the video data being stored in the memory device 105 or secondary storage device 106 in a way to be later described or, alternatively, a television (TV) tuner in the case of receiving broadcast TV programs. When inputting video data via network links, the video data input device 100 is configurable from a network card, such as a local area network (LAN) card or the like.
The CPU 101 is mainly arranged by a microprocessor, which is a control unit that executes software programs as stored in the storage device 105 or secondary storage device 106.
The input device 102 is realizable, for example, by a remote control, keyboard, or pointing device called the “mouse,” for enabling a user to enter more than one playback scene determination parameter, which will be discussed later.
The display device 103 is configurable, for example, by a display adapter and a liquid crystal display (LCD) panel or projector or else. When performing entry of one or some playback scene images and/or a playback scene determination parameter(s) via a graphical user interface (GUI), it displays this GUI. One example of this GUI will be described in detail later.
The audio output device 104 is arranged, for example, to include a speaker(s) for outputting sounds and voices of the scenes being reproduced.
The storage device 105 is implemented, for example, by a random access memory (RAM) or read-only memory (ROM) or equivalents thereto, for storing therein a software program(s) to be executed by the CPU 101 and the data to be processed by this video processing apparatus or, alternatively, video data to be reproduced and/or ranking data relating thereto.
The secondary storage device 106 is designable to include, for example, a hard disk drive (HDD) or a digital versatile disk (DVD) drive or a compact disc (CD) drive or a nonvolatile memory, such as “Flash” memory or the like. The secondary storage 106 stores therein a software program(s) to be executed by the CPU 101 and the data being processed by this video processing apparatus or, alternatively, the video data to be played back and/or the ranking data.
See
As shown in
It should be noted that in cases where the video processing apparatus generates no highlight scene data and alternatively uses the highlight scene data which has already been prepared by another apparatus, some of the illustrative components are eliminatable, i.e., the analysis video data input unit 201, feature data generator 202, feature data storage 213, feature data input unit 214, highlight scene data generator 203 and highlight scene data storage 210.
Additionally in case the video processing apparatus is not expected to create the feature data and alternatively uses the feature data that has already been prepared by another apparatus, the analysis video data input unit 201 and feature data generator 202 plus feature data storage 213 are not always necessary. In case it is unnecessary to present the default playback parameter to the user, the default playback parameter presenter 217 is eliminatable.
The analysis video data input unit 201 inputs, from the video data input device 100, the video data whose features are to be generated and analyzed in order to determine one or several highlight scenes, for production of the feature data and highlight scene data respectively. Note that the analysis video data input unit 201 is rendered operative by the CPU 101 when instructed by the user to prepare such feature data and highlight scene data, upon start-up of playback, or when a scheduler (not depicted) finds video data whose feature data and highlight scene data are not yet created.
The feature data generator unit 202 generates features of the video data as input at the analysis video data input unit 201. This is realizable by generation of some factors—e.g., audio power, correlativity, image brightness distribution, and magnitude of motion—in regard to a respective frame of audio data and image data in the video data as shown for example in
Exemplary feature data of audio part is shown in
The brightness distribution 323 is obtainable, for example, by a process having the steps of dividing the image frame of interest into several regions and then providing a histogram of average luminance values in the respective regions. The magnitude of movement is obtainable, for example, by a process including dividing such an image frame into several regions, generating in each region a motion vector with respect to an immediately preceding frame, and calculating an inner product of the respective motion vectors generated. The feature data generator 202 is operated or executed by CPU 101 whenever video data is input upon execution of the analysis video data input unit 201.
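The two per-frame features described above can be illustrated with a minimal Python sketch. The function names, the grid and bin sizes, and the reading of "inner product of the respective motion vectors" as each vector's inner product with itself (i.e., its squared magnitude) are assumptions made for illustration only, not part of this specification:

```python
def brightness_distribution(frame, grid=2, bins=4, max_val=256):
    """Divide a grayscale frame (2D list of luminance values) into a
    grid x grid set of regions and histogram the per-region average
    luminance -- one possible reading of the brightness-distribution
    feature described above."""
    h, w = len(frame), len(frame[0])
    rh, rw = h // grid, w // grid
    hist = [0] * bins
    for gy in range(grid):
        for gx in range(grid):
            region = [frame[y][x]
                      for y in range(gy * rh, (gy + 1) * rh)
                      for x in range(gx * rw, (gx + 1) * rw)]
            avg = sum(region) / len(region)
            hist[min(int(avg * bins / max_val), bins - 1)] += 1
    return hist


def motion_magnitude(vectors):
    """Sum the inner product of each per-region motion vector with
    itself (its squared length) -- a simple magnitude-of-movement
    measure; other aggregations are equally plausible."""
    return sum(vx * vx + vy * vy for vx, vy in vectors)
```

For a 2x2 frame with two dark and two bright pixels, the histogram places two regions in the lowest bin and two in the highest, matching the intuition that the feature summarizes how luminance is distributed across the frame.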
The feature data storage 213 retains therein the feature data as generated at the feature data generator 202. This is realizable for example by letting the feature data created by the feature data generator 202 be stored in either the storage device 105 or the secondary storage device 106. Additionally the feature data storage 213 may be designed so that, upon activation of the feature data generator 202, it is executed by CPU 101 whenever one frame of feature data is generated.
The feature data input unit 214 permits entry of the feature data being presently retained in the feature data storage 213 or the feature data that has already been prepared by another apparatus. This is realizable, for example, by readout of the feature data being stored in the storage device 105 or the secondary storage device 106. This feature data input unit 214 may be executed by CPU 101 upon execution of the highlight scene data generator 203 in a way as will be described later.
The highlight scene data generator 203 is equivalent in functionality to the highlight scene data input/generation means as claimed, which uses the feature data as input by the feature data input unit 214 to determine one or more important or highlight scenes, thereby generating highlight scene data such as shown in
Even when the video data is of the contents other than music TV programs, similar results are obtainable by a process which includes finding the appearance of a typical pattern based on the brightness distribution and/or the movement of a video image, recognizing it as a highlight scene, and detecting this highlight scene.
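As one illustration of generating highlight scene data from feature data, the following sketch thresholds per-frame audio power and merges contiguous runs of loud frames into (start, end) scenes. The function name, the threshold policy, and the frame-rate handling are all hypothetical simplifications of the detection described above:

```python
def detect_highlights(audio_power, threshold, fps=1):
    """Collect contiguous runs of frames whose audio power meets the
    threshold and emit them as (start_time, end_time) highlight
    scenes -- a stand-in for the music-section / typical-pattern
    detection described in the text."""
    scenes, start = [], None
    for i, p in enumerate(audio_power):
        if p >= threshold and start is None:
            start = i                      # run of loud frames begins
        elif p < threshold and start is not None:
            scenes.append((start / fps, i / fps))
            start = None                   # run ends
    if start is not None:                  # run extends to the last frame
        scenes.append((start / fps, len(audio_power) / fps))
    return scenes
```

A real implementation would smooth the power signal and discard very short runs; those refinements are omitted to keep the sketch minimal.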
The highlight scene data generator 203 is executed by CPU 101 when instructed by the user to create highlight scene data, upon startup of reproduction, or when a scheduler (not shown) finds video data with the highlight scene data being not yet prepared.
The highlight scene data storage 210 retains the highlight scene data as generated at the highlight scene data generator 203. This is implemented for example by storing the highlight scene data generated at the highlight scene data generator 203 in either one of the storage device 105 and the secondary storage device 106. Note however that in case the highlight scene data generated at the highlight scene data generator 203 is arranged to be directly read into the default parameter determination unit 216 and the playback scene determination unit 204 in a way as will be described later, the highlight scene data storage 210 is not always required. In case the highlight scene data storage 210 is designed to exist, this storage 210 may be arranged to be executed by CPU 101 when highlight scene data is generated upon execution of the highlight scene data generator 203.
The highlight scene data input unit 211 is equivalent in function to the highlight scene data input/generation means as claimed and is operable to input the highlight scene data being held in the highlight scene data storage 210 or highlight scene data that has already been created by another device. This is realizable for example by readout of the highlight scene data being stored in the storage device 105 or secondary storage device 106. Note here that this highlight scene data input unit 211 is eliminatable in case the highlight scene data as generated at the highlight scene data generator 203 is read directly into the default parameter determination unit 216 and the playback scene determination unit 204. In case system designs permit presence of the highlight scene data input unit 211, this input unit may be arranged to be executed by CPU 101 when the playback scene determination unit 204 or default parameter determination unit 216 is executed in a way as will be discussed later.
The default parameter determination unit 216 corresponds to the default playback parameter determination means as claimed and functions to determine a default playback parameter(s) based on the above-stated highlight scene data. This is realizable by obtaining a total sum of the respective highlight scene time periods in the highlight scene data and using this sum as a default playback time. Alternatively, a technique is usable for calculating a ratio of the total playback time of highlight scenes to the playback time of the entire video data. More specifically, in case the highlight scene data is the data shown in
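The default-parameter computation described above amounts to a sum and a ratio; a minimal sketch follows, with the function name and the (start, end) scene representation assumed for illustration:

```python
def default_playback_params(highlight_scenes, total_duration):
    """Default playback time = sum of highlight scene durations;
    default playback ratio = that sum divided by the duration of the
    entire video data. Scenes are (start, end) pairs in seconds."""
    total = sum(end - start for start, end in highlight_scenes)
    return total, total / total_duration
```

For instance, two one-minute highlight scenes in a ten-minute program yield a default playback time of 120 seconds and a default playback ratio of 20%.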
The default playback parameter presenter unit 217 is equivalent to the default playback parameter presentation means claimed and is operable to present the user with the playback parameter determined by the default playback parameter determination unit 216. This is realizable for example by causing the playback time or playback ratio calculated by the default playback parameter determination unit 216 to be displayed on the display device 103 via the display unit 208. While various practical examples are conceivable, one example thereof is to display it as the default value of the input at the playback scene decision parameter input unit 205 in a way to be later discussed. Exemplary display screens will be described in detail in conjunction with an explanation of the playback scene determination parameter input unit 205. Although the default playback parameter presenter 217 is unnecessary in case no default playback parameter is presented to the user, it is desirable for the user that the time length or playback ratio to be assigned when wanting to watch important scenes effectively is provided and presented by default. In case the default playback parameter presenter 217 is designed to exist, this default playback parameter presenter 217 may be arranged to be executed by CPU 101, after completion of the processing of the above-stated default parameter determination unit 216, upon execution of the playback scene decision parameter input unit 205 in a way to be later discussed.
The playback scene determination parameter input unit 205 is equivalent to the playback scene determination parameter input means and operates to input, via the input device 102, one or more parameters for determination of a playback scene(s). More specifically, for example, it displays window-like display screens shown in
In
In
In
In
In
Alternatively, in case the user pushed down the playback ratio setup button 623, the video processing apparatus goes into a playback ratio appoint mode, enabling the user to set up a desired playback ratio in the play-time/ratio setup area 624.
In this case, an indicator may be displayed near the playback-time/ratio appoint button although not specifically depicted. At this time, an arrangement is employable for displaying, when the playback-time/ratio appoint window 621 appears, the playback time or ratio which is determined by the default parameter determination unit 216 and presented by the default playback parameter presenter 217 in the mode that was set previously.
Thus it becomes possible for the user to readily figure out the playback time or ratio to be appointed when wanting to watch important scenes effectively. Additionally, when either the playback time setup button 622 or the playback ratio setup button 623 is operated by the user, resulting in a change in mode, recalculation may be executed to convert the parameter value of the mode before the change into the corresponding value of the new mode, which is then displayed in the playback-time/ratio setup window 621.
Also note that the examples of
Furthermore, even after having once replaced the default value with a desired parameter value, it may happen that the user comes to think the default value is better than the input value, because the user changes his or her mind or because of an operation error or the like. Supposing the occurrence of such a scenario, it is very likely that usability further increases if a mechanism is available for going back to the default value by a simplified operation. Examples of the simple operation are pushing down a specified button and clicking on a certain region (including an icon indicative of the “Default Value”).
In this case, a control signal for instruction of output of the default value is input to the CPU 101 by the above-stated operation. In responding thereto, CPU 101 executes the processing for visualization of a display screen on the remote control or at the display device 103 by way of the display unit 208. Whereby, it is expected to further improve the usability.
The playback scene determination unit 204 corresponds to the playback scene determination means claimed, and operates to determine playback scenes based on the parameter as input at the playback scene decision parameter input unit 205 and the highlight scene data that was generated by the highlight scene data generator 203 or input by the highlight scene data input unit 211. More specifically, for example, in case the highlight scene data is the data shown in
In
In
In another exemplary case where the highlight scene data is the one shown in
It is not always required to set it as the first-half part; for example, either the second-half part or a center-containing half portion is alternatively employable. Still alternatively, any half part is usable which involves an audio power-maximal point or a specific image portion, or a half part with this point as its front end. A further alternative example for use as the playback scene is an ensemble of portions of a prespecified length as extracted from the respective scenes; in the above-noted example, what is required is to shorten the entire highlight scenes by 40 seconds in total, so a portion of 40÷3≈13.3 seconds is cut from each highlight scene, and the remainder is used as the playback scene. In this case, the remaining portions left after such cutting and used as playback scenes may also be arranged to contain the first- or second-half part of the highlight scene or a central part thereof or, alternatively, to contain an audio power-maximized point or a specific image point; still alternatively, this point may be designed to become the front end of a playback scene.
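The equal-cut policy in the preceding paragraphs can be sketched as follows, assuming (as one of the options described) that the trimmed amount is removed from each scene's tail so that the remaining first part is played; all names are illustrative:

```python
def fit_to_target(scenes, target):
    """Shorten highlight scenes to a target total playback time by
    trimming an equal amount from the tail of each scene (the
    keep-the-first-part policy; keeping the second half, the center,
    or a part around an audio-power peak are equally valid variants)."""
    total = sum(e - s for s, e in scenes)
    if total <= target:
        return list(scenes)            # already short enough
    cut_each = (total - target) / len(scenes)
    return [(s, max(s, e - cut_each)) for s, e in scenes]
```

With three 30-second highlight scenes and a 50-second target, 40÷3≈13.3 seconds is trimmed from each scene, matching the worked example in the text.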
Note that
In
In
Practically, for example, determine as each playback scene a scene which contains each highlight scene with its head and tail portions extended as shown in
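A sketch of the head-and-tail extension just described, distributing the shortfall equally and clamping to the video bounds; possible overlaps between adjacent extended scenes are ignored for brevity, and all names are assumptions:

```python
def extend_to_target(scenes, target, duration):
    """Lengthen highlight scenes toward a target total playback time by
    extending each scene's head and tail by the same amount, clamped to
    [0, duration]. Merging scenes that come to overlap is omitted."""
    total = sum(e - s for s, e in scenes)
    if total >= target:
        return list(scenes)            # no extension needed
    pad = (target - total) / (2 * len(scenes))
    return [(max(0.0, s - pad), min(duration, e + pad)) for s, e in scenes]
```

For two 10-second scenes and a 40-second target, each scene gains 5 seconds at its head and 5 at its tail.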
Note that
In addition, 802 indicates the start position of such playback scene whereas 803 denotes the end position thereof. It is noted that the start and end positions may be set to a start time and an end time, respectively: in this embodiment, an explanation will be given while assuming that the start and end positions of a playback scene are the start and end time points, respectively, for convenience in discussion herein.
In
Incidentally, the playback scene determination unit 204 is rendered operative by the CPU 101 after input of a playback parameter at the playback scene decision parameter input unit 205 or when it is assigned that the default value is acceptable.
The playback motion-picture data input unit 212 corresponds to the motion data input means as claimed and is operable to input from the video data input device 100 the video data to be reproduced. This playback video data input unit 212 gets started upon acquisition of the to-be-reproduced video data by the playback unit 206 in a way as will be discussed later and is then executed by CPU 101.
The display unit 208 is equivalent in function to the display means claimed and operates to visually display the playback images produced by the playback unit 206. This display unit 208 displays the playback images on the screen of the display device 103 on a per-frame basis. In this case, the display unit 208 is activated by the playback unit 206 whenever one frame of playback image is generated by the playback unit 206, and executed by CPU 101. Optionally this may be designed to display any one of the pop-up windows shown in
The audio output unit 215 is also equivalent to the claimed output means and functions to output at the audio output device 104 the playback sounds and voices as produced at the playback unit 206. This audio output unit 215 is realizable in a way that the playback sound/voice produced by the playback unit 206 is output to the audio output device 104 in units of frames. In this case the audio output unit 215 is activated and executed by CPU 101, once at a time, whenever one frame of playback sound/voice is created by the playback unit 206.
The playback unit 206 corresponds to the playback means and inputs the video data of a playback scene or scenes determined by the playback scene determination unit 204 via the playback motion-picture data input unit 212 and then generates playback images, which are displayed at the display device 103 by way of display unit 208. In addition, it produces playback audio components, which are output to the audio output unit 215. Details of the processing contents in playback unit 206 will be set forth later together with an entire operation. The playback unit 206 is executed by CPU 101 in case normal playback or highlight scene reproduction is instructed by the user.
Next, one example of the playback operation panel of the video processing apparatus will be described while referring to
In
As previously stated, the illustrative video processing apparatus comes with the highlight scene playback instruction button 508. The user is allowed via operation of this button 508 to give instructions as to highlight scene playback startup or highlight scene playback completion with respect to the video data chosen by operation of the video data selector button 502. This is arranged for example in such a way as to perform startup of highlight scene playback upon single pressing of the highlight scene playback instruction button 508 and complete the highlight scene playback and then return to normal reproduction when the same button is pushed once again. An operation at this time will be described later in conjunction with the entire operation of the video processing apparatus along with detailed processing contents of the playback unit 206.
The highlight scene playback indicator 509 may be designed to illuminate during reproduction of highlight scenes.
Respective buttons on the playback operation panel 501 may be arranged by physical buttons on the remote control or may alternatively be overlaid on the display device 103 via the display unit 208 after the image framing was done by CPU 101. If this is the case, the playback time or playback ratio as input by the playback scene decision parameter input unit 205 may be displayed in vicinity of the highlight scene playback instruction button 508 as indicated by 510 in
In case the remote control has its own display panel thereon, the playback time or playback ratio as input by the playback scene decision parameter input unit 205 may be displayed on this display panel. In such case, the remote control may be designed for example to acquire, when the highlight scene playback instruction button 508 is pressed resulting in entry of an instruction to start playback of highlight scenes, the playback time or playback ratio as input by the playback scene decision parameter input unit 205 in association with the video processing apparatus by access using infrared rays.
Next, an entire operation of the video processing apparatus along with the playback processing contents at the playback unit 206 will be discussed with reference to a flowchart of
As shown in
Firstly the playback unit 206 determines whether the highlight scene playback is instructed (at step 1001).
If the decision at step 1001 affirms that such highlight scene playback is not instructed yet, then perform normal reproduction (at step 1002). An explanation of the normal playback is omitted as it has widely been carried out in the art. In the video processing apparatus embodying the invention, a decision as to whether the highlight scene playback is instructed or not is made by judging at regular intervals whether the highlight scene playback instruction button 508 is pressed (at step 1003). In case a present playback session is ended without receipt of any highlight scene playback instruction (at step 1004), terminate the playback. In ordinary reproduction, when display of the whole video data is completed or when playback ending is instructed by the user, determine that the playback is at its end; otherwise, continue execution of the ordinary playback operation.
When it is determined that highlight scene playback is assigned as a result of the decision at the step 1001, the highlight scene playback is carried out in a way which follows. First, receive highlight scene data as input by the highlight scene data input unit 211 (at step 1005). If the highlight scene data is absent, then activate relevant units—e.g., the analysis video data input unit 201, feature data generator 202, feature data storage 213, feature data input unit 214, highlight scene data generator 203, and highlight scene data storage 210—for production of highlight scene data or, alternatively, perform ordinary playback while displaying a message saying that no highlight scene data is found. An alternative arrangement is that when the highlight scene data is absent, the highlight scene playback instruction button 508 is invalidated; still alternatively, in case the highlight scene playback instruction button 508 is designed to be displayed on the display screen, disable the displaying of this button 508.
In case the highlight scene data can be input successfully, the playback unit 206 then causes the default parameter determination unit 216 to calculate the default playback parameter. When the default playback parameter presenter 217 exists, display the default playback parameter calculated (at step 1006).
Subsequently, the playback scene decision parameter input unit 205 inputs the playback parameter (at step 1007), followed by determination of playback scenes by the playback scene determination unit 204 (step 1008).
Then, acquire a present playback position in the video data (at step 1009). Based on this present playback position, acquire the start position and end position of another playback scene next thereto (step 1010). This is realizable by acquisition of the start and end positions of a playback scene out of the playback scenes determined by the playback scene determination unit 204, which is behind the present playback position and is closest thereto.
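Step 1010's lookup of the next playback scene can be sketched as below; treating "behind the present playback position and closest thereto" as the earliest scene whose end lies beyond the present position is one interpretation adopted for illustration, and the function name is hypothetical:

```python
def next_scene(playback_scenes, position):
    """Return the (start, end) of the playback scene to reproduce next:
    the earliest scene that still has material after the present
    playback position, or None when every scene is already past."""
    candidates = [(s, e) for s, e in playback_scenes if e > position]
    return min(candidates) if candidates else None
```

When the position falls between scenes, the following scene is returned; when it falls inside a scene, that same scene is returned so its remainder can be played.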
Next, the playback unit 206 jumps (at step 1011) to the start position of the next playback scene as acquired at the step 1010, and then performs reproduction of this playback scene (step 1012). This is achieved by displaying a video image in the playback scene on the display device 103 via the display unit 208 and also outputting playback sounds and voices in the playback scene to the audio output device 104 by way of the audio output unit 215.
Additionally, determine at regular intervals whether the highlight scene playback instruction button 508 is pushed down or alternatively whether the playback button 503 is depressed during reproduction of this playback scene, thereby deciding whether the ordinary playback is designated (at step 1013). If such ordinary playback is assigned then go to the ordinary playback of steps 1002 to 1004.
During reproduction of the playback scene, an attempt is made at regular intervals to judge whether the playback is completed (at step 1014). If the reproduction is over then terminate the reproduction of the video data. Note here that in the process of reproducing the highlight scenes, when having completed every playback scene determined by the playback scene determination unit 204 or when instructed by the user to terminate the playback operation, it is determined to end the playback; otherwise, continue reproducing playback scenes. Furthermore, during the playback scene reproduction, an attempt is made at fixed intervals to judge whether the playback parameter is modified (at step 1015). If the playback parameter is changed then return to step 1005.
If the playback parameter is kept unchanged, then subsequently acquire a present playback position (at step 1016) and determine whether it reaches the end position of the playback scene (step 1017). This is determinable by comparing the end position of the playback scene acquired at the step 1010 to the present playback position obtained at the step 1016.
In case a result of the decision at step 1017 indicates that the present playback position does not yet reach the end position of the playback scene, repeat the processes of steps 1012 to 1017 to thereby continue the playback scene reproduction. Alternatively, if the decision result at step 1017 reveals that it has reached the end position of the playback scene, then repeat the steps 1009 to 1017 to thereby sequentially reproduce those playback scenes determined by the playback scene determination unit 204. Upon completion of all the playback scenes determined by playback scene determination unit 204, recognize it at step 1014, followed by termination of the reproduction.
With this procedure, as shown in
In
Although in this embodiment the explanation was given as to one specific case where the present playback position is prior to the start position of the initial playback scene, the invention is also applicable to cases where such present playback position is behind the start positions of several playback scenes. In this case, a technique may be used for inhibiting reproduction of any playback scene before the present position or for excluding it from the objects to be processed stated supra. This dynamically enables the default playback parameter determination and presentation by the default parameter determination unit 216 and the default playback parameter presenter 217, the playback parameter entry by the playback scene decision parameter input unit 205, and the playback scene determination by the playback scene determination unit 204.
In an embodiment 2, a video processing apparatus is provided, which performs ranking (grading) of scenes in the video or video data and then determines based thereon appropriate highlight scenes and playback scenes.
As shown in
The ranking data generator 1501 is equivalent in functionality to the ranking data input/generation means as claimed and is responsive to receipt of the feature data as input at the feature data input unit 214, for performing ranking of scenes in video data to thereby generate ranking data such as shown in
Alternatively, even when the contents of the video data are other than a music TV program, similar results are obtainable by raising the rank of a scene whenever a typical scene appears, based on the brightness distribution or the movement of the video image, for example. Obviously, the intended scene ranking is also attainable by using these methods in combination.
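As a minimal sketch of such scene ranking, the following assumes each scene carries hypothetical `audio_level` and `luminance` feature values and ranks scenes by a simple combined score; the actual embodiment may weight or combine features quite differently.

```python
def rank_scenes(scenes):
    """Assign ranks to scenes: the highest-scoring scene gets rank 1.

    Each scene is a dict with illustrative fields, e.g.
    {"start": 0, "end": 30, "audio_level": 0.7, "luminance": 0.4}.
    The additive score is an assumption standing in for whatever
    feature combination the ranking data generator 1501 applies.
    """
    scored = sorted(
        scenes,
        key=lambda s: s["audio_level"] + s["luminance"],
        reverse=True,
    )
    # Emit ranking-data entries: rank plus the scene boundaries.
    return [
        {"rank": i + 1, "start": s["start"], "end": s["end"]}
        for i, s in enumerate(scored)
    ]
```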
The ranking data generator 1501 is rendered operative by CPU 101 when preparation of ranking data is instructed by the user, when reproduction starts, or when a scheduler (not shown) detects video data whose ranking data has not yet been prepared.
The ranking data retainer 1502 holds therein the ranking data generated at the ranking data generator 1501. This is realizable by letting the ranking data generator 1501's output ranking data be stored in the storage device 105 or the secondary storage device 106.
The ranking data retainer 1502 is not always necessary when an arrangement is used that permits the ranking data generated by the ranking data generator 1501 to be read directly into the highlight scene data generator 203. When the ranking data retainer 1502 is provided, it may be arranged to be executed by CPU 101 whenever ranking data is created during operation of the ranking data generator 1501.
The ranking data input unit 1503 corresponds to the ranking data input/generation means as claimed and operates to input either the ranking data retained in the ranking data retainer 1502 or ranking data created in advance by another device or apparatus. This may be realized, for example, by readout of the ranking data stored in the storage device 105 or the secondary storage device 106. In case an arrangement is used which permits the ranking data output by the ranking data generator 1501 to be read directly into the highlight scene data generator 203, the ranking data input unit 1503 may be eliminated. When the ranking data input unit 1503 is provided, it is arranged to be executed by CPU 101 when the highlight scene data generator 203 is activated.
In this embodiment 2, the processing of the analysis video data input unit 201, the feature data input unit 214, the highlight scene data generator 203 and the playback scene determination unit 204 is modified as follows.
The analysis video data input unit 201 inputs video data from the video data input device 100 and analyzes its video image features in order to perform ranking of scenes and determine highlight scenes, thereby generating the feature data, the ranking data and the highlight scene data. The analysis video data input unit 201 is rendered operative by CPU 101 when instructed by the user to prepare the feature data, ranking data or highlight scene data, upon startup of reproduction, or when a scheduler (not shown) finds video data for which the feature data, ranking data or highlight scene data has not yet been prepared.
The feature data input unit 214 permits entry of the feature data held in the feature data storage 213 or feature data already generated by another apparatus or device. This is realizable, for example, by readout of the feature data stored in the storage device 105 or the secondary storage device 106. Additionally, the feature data input unit 214 may be executed by CPU 101 upon activation of the ranking data generator 1501 or the highlight scene data generator 203.
The highlight scene data generator 203 uses the feature data as input at the feature data input unit 214 and the ranking data generated at the ranking data generator 1501 to determine highlight scenes and then generates highlight scene data such as shown in
The determination of highlight scenes in the highlight scene data generator 203 is achievable, for example, by using audio portions in the ranking data in case the video data has the contents of a music TV program. Even when its contents are other than a music program, similar results are obtainable by extracting a scene in which a typical pattern appears, based on the luminance distribution and/or movement of the video image in the ranking data, by way of example. Alternative examples include, but are not limited to, a scene whose audio pattern is greater than or equal to a specified level in the ranking data, a scene whose luminance is greater than or equal to a specified level in the ranking data, a specific scene having a prespecified luminance distribution in the ranking data, and any given upper-level scene in the ranking data.
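The threshold-based and rank-based highlight selection criteria listed above can be sketched roughly as follows; the field names and entry layout of the ranking data are assumptions for illustration only.

```python
def find_highlight_scenes(ranking_data, audio_threshold=None,
                          luminance_threshold=None, top_n=None):
    """Select highlight scenes from ranking-data entries.

    Each entry is assumed to look like
    {"rank": 1, "start": 0, "end": 30, "audio": 0.9, "luminance": 0.2}.
    A scene qualifies if its audio or luminance meets the respective
    threshold, or if it falls within the top_n upper-level ranks.
    """
    highlights = []
    for entry in ranking_data:
        if audio_threshold is not None and entry["audio"] >= audio_threshold:
            highlights.append(entry)        # audio at or above specified level
        elif luminance_threshold is not None and entry["luminance"] >= luminance_threshold:
            highlights.append(entry)        # luminance at or above specified level
        elif top_n is not None and entry["rank"] <= top_n:
            highlights.append(entry)        # upper-level scene in the ranking
    return highlights
```

Only the criteria whose thresholds are supplied take effect, so the same routine covers each of the alternative examples above.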
In
The playback scene determination unit 204 determines one or more playback scenes based on the parameter input by the playback scene decision parameter input unit 205, the ranking data generated by the ranking data generator 1501 or entered at the ranking data input unit 1503, and the highlight scene data generated by the highlight scene data generator 203. Practically, in an exemplary case where the ranking data for video data of 500 seconds is the data shown in
In
In
In
Alternatively, in case the highlight scene data of video data with its time length of 500 seconds is the data shown in
For example, in the above-stated case, high-rank scenes with a total time length of 40 seconds are selected as the playback scenes in the way shown in
In
In addition, 1602 indicates the start position of such playback scene while 1603 is the end position of it. Optionally the start and end positions may be replaced by a start time and an end time respectively. In this embodiment, an explanation will be given under an assumption that the start and end positions of playback scene are the start and end time points respectively, for purposes of convenience in discussion. Additionally in
As apparent from
For example, in the above-stated case, scenes which are higher in rank and whose total time length is 120 seconds are selected as the playback scenes as shown in
In
A scene 1608 is also the playback scene, and is a part of the rank-5 scene. Numeral 1602 denotes the start position of such playback scene, and 1603 is the end position thereof. The start and end positions may be replaced by a start time and an end time respectively. In this embodiment, an explanation will be given while assuming that the start and end positions of playback scene are the start and end time points respectively, for purposes of convenience only. Additionally in
It can be seen from
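The rank-based playback scene determination described above, in which higher-rank scenes are selected until the total time reaches the user-assigned parameter (40 or 120 seconds in the examples) and the last scene is taken only in part if necessary, can be sketched as follows; the ranking-data layout is a hypothetical assumption.

```python
def select_playback_scenes(ranking_data, target_seconds):
    """Pick scenes in rank order (rank 1 first) until the accumulated
    time reaches target_seconds.

    The last selected scene is truncated so the total equals the
    user-assigned parameter exactly (cf. a playback scene that is only
    a part of a ranked scene), and the chosen scenes are returned in
    temporal order for sequential reproduction.
    """
    selected = []
    remaining = target_seconds
    for scene in sorted(ranking_data, key=lambda s: s["rank"]):
        if remaining <= 0:
            break
        length = scene["end"] - scene["start"]
        take = min(length, remaining)       # truncate the final scene
        selected.append({"start": scene["start"], "end": scene["start"] + take})
        remaining -= take
    return sorted(selected, key=lambda s: s["start"])
```

Raising `target_seconds` simply admits more (and longer portions of) lower-rank scenes, which is the behavior the two parameter examples above illustrate.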
This embodiment 2 is further arranged to activate, when the highlight scene data is absent at the step 1005 in
Although in the embodiments 1 and 2 the highlight scene data generator 203 and playback scene determination unit 204 are designed to perform fixed processing irrespective of the category of video data, the processing may be modified to switch between the methods shown in the embodiments 1 and 2 in compliance with the video data category.
In this case, as shown in
The playback scene determination unit 204 is likewise designed to determine a sequence of playback scenes by a predetermined method, which is one of the methods shown in embodiments 1 and 2, in accordance with the video data category obtained by the category acquisitor 2001. Thus it becomes possible to reproduce highlight scenes effectively in a way pursuant to the category of the video data.
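Such category-dependent switching between the methods of embodiments 1 and 2 can be sketched as a simple dispatch table; the category names and the two drastically simplified method bodies below are illustrative assumptions, not the actual processing of either embodiment.

```python
def method_embodiment1(scenes, target_seconds):
    # Stand-in for the feature-threshold style selection of embodiment 1.
    return [s for s in scenes if s.get("audio", 0.0) >= 0.5][:1]

def method_embodiment2(scenes, target_seconds):
    # Stand-in for the rank-based selection of embodiment 2.
    return sorted(scenes, key=lambda s: s["rank"])[:1]

# Hypothetical mapping from the category obtained by the category
# acquisitor 2001 to the predetermined determination method.
CATEGORY_METHODS = {
    "music": method_embodiment2,
    "news": method_embodiment1,
}

def determine_playback_scenes(scenes, category, target_seconds=60):
    method = CATEGORY_METHODS.get(category, method_embodiment1)
    return method(scenes, target_seconds)
```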
This invention should not be limited exclusively to the above-stated embodiments and may be implemented with modifications without departing from the scope of the invention. Also note that the embodiments involve various inventive contributions, and various inventive features are extractable by adequate combinations of the plurality of constituent components disclosed herein. For example, even when one or several components are omitted from the components shown in the embodiments, the intended objective as set forth in the description is attainable. It would readily occur to those skilled in the art that, in cases where the effects and advantages stated supra are obtained, such a configuration with components eliminated should be interpreted to fall within the scope of coverage of the invention.
It should be further understood by those skilled in the art that although the foregoing description has been made on embodiments of the invention, the invention is not limited thereto and various changes and modifications may be made without departing from the spirit of the invention and the scope of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2005-120484 | Apr 2005 | JP | national |