Digital graphic design, image editing, audio editing, and video editing applications (hereafter collectively referred to as media content editing applications or media editing applications) provide graphic designers, media artists, and other users with the necessary tools to create a variety of media content. Examples of such applications include Final Cut Pro® and iMovie®, both sold by Apple Inc. These applications give users the ability to edit, combine, transition, overlay, and piece together different media content in a variety of manners to create a resulting media project. The resulting media project specifies a particular sequenced composition of any number of text, audio clips, images, or video content that is used to create a media presentation.
Various media editing applications facilitate such compositions through electronic means. Specifically, a computer or other electronic device with a processor and a computer readable storage medium executes the media content editing applications. In so doing, the computer generates a graphical interface whereby designers digitally manipulate graphical representations of the media content to produce a desired result.
One difficulty in media editing is that a user cannot easily and intuitively alter the timing of media clips in the graphical interface. For example, the user may wish to graphically specify that media content within a particular range be played back at a particular playback speed (e.g., slow motion or accelerated motion). The user may also wish to apply other speed or timing effects (e.g., instant replay or rewind) to the particular range of media content.
Some existing media editing applications facilitate the application of speed or timing effects by providing a playback curve. A playback curve is an abstract representation of media content that specifies the relationship between the media content and the playback time. A user can graphically manipulate the playback curve in order to adjust the playback timing of the media content. Unfortunately, such a timing adjustment is based on manipulations of an abstract representation of the media content that does not intuitively convey to the user what has happened to the media content. Worse yet, allowing direct user manipulation of the playback curve can in some instances cause unintended visual effects (such as playback speed overshoot).
Some embodiments of the invention provide a novel method for retiming a portion of a media content (e.g., audio data, video data, audio and video data, etc.) in a media-editing application. In some embodiments, the media editing application includes a user interface for defining a range in order to select a portion of the media content. The media editing application performs retiming by applying a speed effect to the portion of the media content selected by the defined range. For a faster speed effect, the media editing application retimes the selected portion of the media content by sampling the media content at a faster rate. For a slower speed effect, the media editing application retimes the selected portion of the media content by sampling the content at a slower rate.
The media-editing application of some embodiments provides preset speed effects so a user can quickly achieve an initial retiming effect on a selected range in the composite presentation. The initial retiming effect can then be used as a template for further adjustment and refinement by the user for a desired result.
In order to perform retiming operations, some embodiments of the media editing application maintain a playback curve for adjusting the playback speed of the composite presentation. The playback curve is for mapping each video frame that needs to be played back at a particular instant in time to one or more video frames in the source media clips. In some embodiments, the same playback curve is also used to map audio playback such that the slope of the playback curve at a particular instant in time corresponds to the audio playback speed at that particular instant in time.
Each retiming operation is implemented based on adjustments of the playback curve. In some embodiments, the playback curve is entirely controlled and maintained by the media editing application and cannot be directly manipulated by the user. After a retiming operation, some embodiments perform a curve smoothing operation that avoids overshooting and maintains monotonicity between keyframes.
Some embodiments of the media editing application support anchored clips that are anchored to a particular video frame during playback. Some embodiments ensure that the anchored clip remains anchored to the correct video frame after the retiming operation. Some of these embodiments map the anchored frame to an anchor media time TA, and then use the anchor media time TA to map to the correct new playback time after the retiming operation. Some embodiments map the anchor media time TA to a unique anchor playback time for the anchored clip by marking a section of the playback curve defined by two keyframes as being associated with the anchored clip.
The preceding Summary is intended to serve as a brief introduction to some embodiments of the invention. It is not meant to be an introduction or overview of all inventive subject matter disclosed in this document. The Detailed Description that follows and the Drawings that are referred to in the Detailed Description will further describe the embodiments described in the Summary as well as other embodiments. Accordingly, to understand all the embodiments described by this document, a full review of the Summary, Detailed Description, and Drawings is needed. Moreover, the claimed subject matters are not to be limited by the illustrative details in the Summary, Detailed Description, and Drawings, but rather are to be defined by the appended claims, because the claimed subject matters can be embodied in other specific forms without departing from the spirit of the subject matters.
The novel features of the invention are set forth in the appended claims. However, for purpose of explanation, several embodiments of the invention are set forth in the following figures.
In the following description, numerous details are set forth for the purpose of explanation. However, one of ordinary skill in the art will realize that the invention may be practiced without the use of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail.
Some embodiments of the invention provide a novel method for retiming a portion of a media content (e.g., audio data, video data, audio and video data, etc.) in a media-editing application. In some embodiments, the media editing application includes a user interface for defining a range in order to select the portion of the media content. The media editing application then performs retiming by applying a speed effect to the portion of the media content selected by the defined range. For a faster speed effect, the media editing application retimes the selected portion of the media content by sampling the media content at a faster rate. For a slower speed effect, the media editing application retimes the selected portion of the media content by sampling the content at a slower rate.
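As an illustration of this faster/slower sampling relationship, the following Python sketch (all names hypothetical, not the application's actual code) steps through a selected source range at a rate scaled by the speed effect:

```python
# A minimal sketch, assuming a simple time-based model: retiming a selected
# range by changing the rate at which the source media is sampled.

def retimed_sample_times(range_start, range_end, speed, frame_period):
    """Yield one source-media time per output frame for a range played
    at `speed` (2.0 = 200% "faster", 0.5 = 50% "slower").

    A faster speed steps through the source in larger increments
    (sampling it at a faster rate); a slower speed, in smaller ones.
    """
    t = range_start
    while t < range_end:
        yield t
        t += frame_period * speed

# A 1-second source range at 25 fps: 200% speed yields about 13 output
# frames, 50% speed about 50, for the same source content.
fast = list(retimed_sample_times(0.0, 1.0, 2.0, 1 / 25))
slow = list(retimed_sample_times(0.0, 1.0, 0.5, 1 / 25))
assert len(fast) < len(slow)
```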
A media clip in some embodiments is a piece of media content. Examples of types of media content include audio data, video data, audio and video data, text data, image/picture data, and/or other media data. In some embodiments, a media clip can be a video clip or an audio clip. In other embodiments, a media clip can be a video clip, an audio clip, an audio and video clip, a sequence of media clips (also referred to as a media clip sequence), a text clip, a text overlay, a still image or picture, or any other type of media clip that can be used to create a composite presentation. In this application, a media clip may also refer to the graphical representation of the media clip in the GUI of a media-editing application of some embodiments.
For some embodiments,
The media library 110 is an area in the GUI 100 through which the application's user can select media clips (video or audio) to add to a media presentation that the user is compositing with the application. In the example of
The timeline 130 provides a visual representation of a composite presentation being created by the user. In some embodiments, a composite presentation in the timeline 130 includes one or more containers of media clips. Media clips such as video and/or audio clips can be brought into one of the containers in the timeline 130 from the media library 110 for creating the composite presentation.
The timeline 130 includes a central compositing lane 150 that includes clips 152 (clip A) and 154 (clip B). The inclusion of clips A and B is graphically indicated by graphical representations of the clips in the central compositing lane 150. A clip can contain a single piece of media content from a single source. A clip can also be a compound clip that includes several pieces of media content from multiple sources. A clip in the timeline is therefore referred to as a media container in some embodiments. A central compositing lane in some embodiments is the main track of the composite presentation, upon which other video clips and audio clips can be overlaid. In some embodiments, the timeline 130 has only one track and the central compositing lane 150 is that only track of the timeline. In some other embodiments, the timeline has multiple tracks and the central compositing lane 150 is one of the tracks.
Clip 152 includes an effect bar 160 that indicates the status of an effect being applied to the clip A. Clip 154 includes an effect bar 170 that indicates the status of an effect being applied to clip B. In the example of
The effects menu 140 provides a menu of effects that can be selected and applied to the composite presentation. One of ordinary skill in the art would recognize that the effects menu can be implemented in the GUI 100 as a fixed panel or as a pop-up menu that appears only when specifically invoked by the user (e.g., by a mouse click or a selection of a particular GUI item). The effects menu 140 includes speed effects such as “slower”, “faster”, “ramp” and “hold”, which are described in more detail further below. In some embodiments, the effects menu 140 also includes other speed effects such as “instant replay”, “rewind”, and “conform speed”, which are described in more detail further below. In addition to speed effects that affect the playback time of the composite presentation, the effects menu in some embodiments also includes one or more other visual or audio effects that do not affect the playback time of the composite presentation. In some embodiments, the effects in the effects menu 140 are “preset” effects. A preset effect is an effect that, once selected by the user, is automatically applied to a range of media content. A user can then use the GUI to further manipulate the resultant composite presentation and adjust the applied speed effect.
In some embodiments, operations of the media editing application that change the timing relationship between playback and content (such as applying a preset speed effect or adjusting a previously applied speed effect) are performed by a retiming engine of the media editing application. In some embodiments, the media editing application translates the selection of a preset speed effect and/or a user adjustment of the speed effect of a media clip into one or more retiming commands for the retiming engine, which generates, maintains, and adjusts a playback curve for the media clips according to the retiming command. The retiming engine or retiming module will be further described by reference to
The playback activation item 122 is a conceptual illustration of one or more UI items that allow the media editing application to activate its video and audio playback. The retiming tool activation item 124 is a conceptual illustration of one or more UI items that allow the media editing application to activate its retiming tool. Different embodiments of the invention implement these UI items differently. Some embodiments implement them as a selectable UI button, others as a command that can be selected in a pull-down or drop-down menu, and still others as a command that can be selected through one or more keystroke operations. Accordingly, the selection of the playback activation item 122 and retiming tool activation item 124 may be received from a cursor controller (e.g., a mouse, touchpad, trackball, etc.), from a touchscreen (e.g., a user touching a UI item on a touchscreen), or from a keyboard input (e.g., a hotkey or a key sequence), etc. Yet other embodiments allow the user to access the retiming tool feature through two or more of such UI implementations or other UI implementations.
In order to perform retiming operations, some embodiments of the media editing application maintain a playback curve for adjusting the playback speed of the composite presentation. The playback curve is for mapping each video frame that needs to be played back at a particular instant in time to one or more video frames in the source media clips. In some embodiments, the same playback curve is also used to map audio playback such that the slope of the playback curve at a particular instant in time corresponds to the audio playback speed at that particular instant in time. Each retiming operation is implemented based on adjustments of the playback curve. In some embodiments, the playback curve is entirely controlled and maintained by the media editing application and cannot be directly manipulated by the user.
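As a concrete illustration of such a mapping, the following sketch models a playback curve as a piecewise-linear function defined by keyframes; this data model is an assumption made for illustration, not the application's internal representation:

```python
import bisect

class PlaybackCurve:
    """A hypothetical playback curve: a piecewise-linear mapping from
    playback time to media time, defined by keyframes."""

    def __init__(self, keyframes):
        # keyframes: sorted (playback_time, media_time) pairs
        self.keys = sorted(keyframes)

    def _segment(self, t):
        ps = [p for p, _ in self.keys]
        i = bisect.bisect_right(ps, t) - 1
        i = max(0, min(i, len(self.keys) - 2))
        return self.keys[i], self.keys[i + 1]

    def media_time(self, t):
        """Map a playback time to a media time by linear interpolation."""
        (p0, m0), (p1, m1) = self._segment(t)
        return m0 + (m1 - m0) * (t - p0) / (p1 - p0)

    def speed(self, t):
        """Slope at playback time t; per the text above, this slope also
        gives the instantaneous audio playback speed."""
        (p0, m0), (p1, m1) = self._segment(t)
        return (m1 - m0) / (p1 - p0)

# Normal speed for 2 s of playback, then 50% slow motion for 4 s:
curve = PlaybackCurve([(0, 0), (2, 2), (6, 4)])
assert curve.media_time(4) == 3.0 and curve.speed(4) == 0.5
```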
The six stages 101-106 of the retiming operation of
The first stage 101 of
The second stage 102 of
The third stage 103 of
The fourth stage 104 of
The second section 162 corresponds to the selected range 180 after the application of the “slower” preset speed effect. The duration of the second section 162 (t′3-t2) is longer than the duration of the selected range 180 (t3-t2) because the content in the second section 162 is played back at 50% of the normal speed (thus the duration of the section is twice as long as before the speed effect). The section 162 of the effects bar is also marked with a visual indication (i.e., a diagonal hash pattern) to indicate that this section is to be played back at a speed slower than normal. In some embodiments, each section of the effects bar is associated with a visual indication of the effect being applied. For example, some embodiments color code each section of the effects bar according to the speed of the playback (e.g., green for normal speed, orange for speed slower than normal, blue for speed faster than normal, and red for stoppage or pause during playback). Some of these embodiments use different color intensity levels to indicate different levels of speed. Some embodiments use different patterns and/or different text on the effects bar to provide visual indications of the effects being applied.
The application of the “slower” speed effect corresponds to stage 204 of
The fourth stage 104 of
The fifth stage 105 of
The last stage 106 of
The application of the “faster” speed effect corresponds to stage 205 of
In some embodiments, the media editing application performs a curve smoothing operation after a retiming operation. As illustrated in
In some embodiments, media clips do not necessarily start at time 0 (e.g., a media clip can start at 1 minute after time 0). In these instances, the retiming operations that change playback speeds of specific sections (such as the “faster” and “slower” speed effect presets discussed above) perform the retiming operation by pivoting on time 0 rather than the actual start time of the media clip. For example, if a media clip starts at 5 s and ends at 15 s, a retiming operation that slows the entire media clip to 50% playback speed would change the start time of the media clip to 10 s and the end time to 30 s (instead of leaving the start time at 5 s and changing the end time to 25 s).
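The following sketch illustrates this pivot-on-zero arithmetic (the helper name is hypothetical):

```python
# A minimal sketch: retiming scales clip boundaries about time 0,
# not about the clip's own start time.

def retime_about_zero(start, end, speed):
    """Return the new (start, end) of a clip played at `speed`
    (0.5 = half speed), pivoting on time 0."""
    return start / speed, end / speed

# A clip from 5 s to 15 s slowed to 50% now spans 10 s to 30 s,
# rather than keeping its start at 5 s and ending at 25 s.
assert retime_about_zero(5, 15, 0.5) == (10.0, 30.0)
```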
For some embodiments,
The process 300 next receives (at 330) a retiming command that specifies a set of retiming parameters. In the example of
Next, the process updates (at 340) the playback curve of the clip or set of clips in the media clip according to the retiming command and associated parameters. In the example of
The process next performs (at 350) a curve-smoothing operation on the updated playback curve. The curve-smoothing operation makes changes to the curve to minimize or eliminate playback speed discontinuities. After performing the curve smoothing operation, the process 300 ends.
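The following self-contained sketch illustrates the effect of steps 330 and 340 on a keyframe-based playback curve for a “slower” command; the data model and function name are assumptions for illustration, and the separate smoothing step (at 350) is omitted:

```python
# A sketch: apply a "slower" preset to a selected range [t0, t1] of a
# piecewise-linear playback curve, stretching the range in playback time
# and shifting all later keyframes accordingly.

def apply_slower(keyframes, t0, t1, speed=0.5):
    """keyframes: sorted (playback_time, media_time) pairs with
    keyframes already inserted at t0 and t1."""
    stretch = (t1 - t0) / speed - (t1 - t0)  # extra playback time needed
    out = []
    for p, m in keyframes:
        if p <= t0:
            out.append((p, m))                      # before range: unchanged
        elif p <= t1:
            out.append((t0 + (p - t0) / speed, m))  # inside range: rescaled
        else:
            out.append((p + stretch, m))            # after range: shifted
    return out

# A 10 s normal-speed clip; slowing 2 s-4 s to 50% ends the clip at 12 s.
curve = [(0, 0), (2, 2), (4, 4), (10, 10)]
assert apply_slower(curve, 2, 4) == [(0, 0), (2, 2), (6.0, 4), (12.0, 10)]
```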
A more detailed view of a media editing application with these features is illustrated in
The clip library 405 includes a set of folders through which a user accesses media clips that have been imported into the media-editing application. Some embodiments organize the media clips according to the device (e.g., physical storage device such as an internal or external hard drive, virtual storage device such as a hard drive partition, etc.) on which the media represented by the clips are stored. Some embodiments also enable the user to organize the media clips based on the date the media represented by the clips was created (e.g., recorded by a camera). As shown, the clip library 405 includes media clips from both 2009 and 2011.
Within a storage device and/or date, users may group the media clips into “events”, or organized folders of media clips. For instance, a user might give the events descriptive names that indicate what media is stored in the event (e.g., the “New Event 2-8-09” event shown in clip library 405 might be renamed “European Vacation” as a descriptor of the content). In some embodiments, the media files corresponding to these clips are stored in a file storage structure that mirrors the folders shown in the clip library.
Within the clip library, some embodiments enable a user to perform various clip management actions. These clip management actions may include moving clips between events, creating new events, merging two events together, duplicating events (which, in some embodiments, creates a duplicate copy of the media to which the clips in the event correspond), deleting events, etc. In addition, some embodiments allow a user to create sub-folders of an event. These sub-folders may include media clips filtered based on tags (e.g., keyword tags). For instance, in the “New Event 2-8-09” event, all media clips showing children might be tagged by the user with a “kids” keyword, and then these particular media clips could be displayed in a sub-folder of the event that filters clips in this event to only display media clips tagged with the “kids” keyword.
The clip browser 410 allows the user to view clips from a selected folder (e.g., an event, a sub-folder, etc.) of the clip library 405. As shown in this example, the folder “New Event 2-8-11 3” is selected in the clip library 405, and the clips belonging to that folder are displayed in the clip browser 410. Some embodiments display the clips as thumbnail filmstrips, as shown in this example. By moving a cursor (or a finger on a touchscreen) over one of the thumbnails (e.g., with a mouse, a touchpad, a touchscreen, etc.), the user can skim through the clip. That is, when the user places the cursor at a particular horizontal location within the thumbnail filmstrip, the media-editing application associates that horizontal location with a time in the associated media file, and displays the image from the media file for that time. In addition, the user can command the application to play back the media file in the thumbnail filmstrip.
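A minimal sketch of this position-to-time association (names hypothetical):

```python
# Skimming: a horizontal cursor position over a thumbnail filmstrip maps
# linearly to a time in the associated media file.

def skim_time(cursor_x, strip_x, strip_width, clip_duration):
    fraction = (cursor_x - strip_x) / strip_width
    return max(0.0, min(1.0, fraction)) * clip_duration

# A cursor 150 px into a 300 px filmstrip of a 60 s clip maps to 30 s.
assert skim_time(150, 0, 300, 60) == 30.0
```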
In addition, the thumbnails for the clips in the browser display an audio waveform underneath the clip that represents the audio of the media file. In some embodiments, as a user skims through or plays back the thumbnail filmstrip, the audio plays as well.
Many of the features of the clip browser are user-modifiable. For instance, in some embodiments, the user can modify one or more of the thumbnail size, the percentage of the thumbnail occupied by the audio waveform, whether audio plays back when the user skims through the media files, etc. In addition, some embodiments enable the user to view the clips in the clip browser in a list view. In this view, the clips are presented as a list (e.g., with clip name, duration, etc.). Some embodiments also display a selected clip from the list in a filmstrip view at the top of the browser so that the user can skim through or playback the selected clip.
The timeline 415 provides a visual representation of a composite presentation (or project) being created by the user of the media-editing application. Specifically, it displays one or more geometric shapes that represent one or more media clips that are part of the composite presentation. The timeline 415 of some embodiments includes a primary lane 440 (also called a “spine”, “primary compositing lane”, or “central compositing lane”) as well as one or more secondary lanes 445 (also called “anchor lanes”). The spine represents a primary sequence of media which, in some embodiments, does not have any gaps. The clips in the anchor lanes are anchored to a particular position along the spine (or along a different anchor lane). Anchor lanes may be used for compositing (e.g., removing portions of one video and showing a different video in those portions), B-roll cuts (i.e., cutting away from the primary video to a different video whose clip is in the anchor lane), audio clips, or other composite presentation techniques.
The user can add media clips from the clip browser 410 into the timeline 415 in order to add the clip to a presentation represented in the timeline. Within the timeline, the user can perform further edits to the media clips (e.g., move the clips around, split the clips, trim the clips, apply effects to the clips, etc.). The length (i.e., horizontal expanse) of a clip in the timeline is a function of the length of media represented by the clip. As the timeline is broken into increments of time, a media clip occupies a particular length of time in the timeline. As shown, in some embodiments the clips within the timeline are shown as a series of images. The number of images displayed for a clip varies depending on the length of the clip in the timeline, as well as the size of the clips (as the aspect ratio of each image will stay constant).
As with the clips in the clip browser, the user can skim through the timeline or play back the timeline (either a portion of the timeline or the entire timeline). In some embodiments, the playback (or skimming) is not shown in the timeline clips, but rather in the preview display area 420.
The preview display area 420 (also referred to as a “viewer”) displays images from media files that the user is skimming through, playing back, or editing. These images may be from a composite presentation in the timeline 415 or from a media clip in the clip browser 410. In this example, the user has been skimming through the beginning of clip 440, and therefore an image from the start of this media file is displayed in the preview display area 420. As shown, some embodiments will display the images as large as possible within the display area while maintaining the aspect ratio of the image.
The inspector display area 425 displays detailed properties about a selected item and allows a user to modify some or all of these properties. The selected item might be a clip, a composite presentation, an effect, etc. In this case, the clip that is shown in the preview display area 420 is also selected, and thus the inspector displays information about media clip 440. This information includes duration, file format, file location, frame rate, date created, audio information, etc. about the selected media clip. In some embodiments, different information is displayed depending on the type of item selected.
The additional media display area 430 displays various types of additional media, such as video effects, transitions, still images, titles, audio effects, standard audio clips, etc. In some embodiments, the set of effects is represented by a set of selectable UI items, each selectable UI item representing a particular effect. In some embodiments, each selectable UI item also includes a thumbnail image with the particular effect applied. The display area 430 is currently displaying a set of effects for the user to apply to a clip. In this example, only two effects are shown in the display area (the keyer effect and the luma keyer effect, because the user has typed the word “keyer” into a search box for the effects display area).
The toolbar 435 includes various selectable items for editing, modifying what is displayed in one or more display areas, etc. The right side of the toolbar includes various selectable items for modifying what type of media is displayed in the additional media display area 430. The illustrated toolbar 435 includes items for video effects, visual transitions between media clips, photos, titles, generators and backgrounds, etc. In addition, the toolbar 435 includes an inspector selectable item that causes the display of the inspector display area 425 as well as items for applying a retiming operation to a portion of the timeline, adjusting color, and other functions. In some embodiments, selecting the retiming tool activation item 475 invokes a speed effects menu that includes one or more selectable retiming operation presets.
The left side of the toolbar 435 includes selectable items for media management and editing. Selectable items are provided for adding clips from the clip browser 410 to the timeline 415. In some embodiments, different selectable items may be used to add a clip to the end of the spine, add a clip at a selected point in the spine (e.g., at the location of a playhead), add an anchored clip at the selected point, perform various trim operations on the media clips in the timeline, etc. The media management tools of some embodiments allow a user to mark selected clips as favorites, among other options.
One of ordinary skill will also recognize that the set of display areas shown in the GUI 400 is one of many possible configurations for the GUI of some embodiments. For instance, in some embodiments, the presence or absence of many of the display areas can be toggled through the GUI (e.g., the inspector display area 425, additional media display area 430, and clip library 405). In addition, some embodiments allow the user to modify the size of the various display areas within the UI. For instance, when the additional media display area 430 is removed, the timeline 415 can increase in size to include that area. Similarly, the preview display area 420 increases in size when the inspector display area 425 is removed.
Several more detailed embodiments of the invention are described below. Section I describes other retiming operations performed by other speed effect presets. Section II describes retiming operations performed by user manipulations. Section III describes in further detail the mapping of playback time using the playback curve. Section IV describes the interaction between retiming operations and anchor clips. Section V describes a media editing application that performs retiming. Finally, Section VI describes an electronic system with which some embodiments of the invention are implemented.
I. Speed Effect Presets
In some embodiments, the media editing application provides preset speed effects so a user can quickly achieve an initial retiming effect on a selected range in the composite presentation. The initial retiming effect can then be used as a template for further adjustments and refinements by the user for a desired result.
A “ramp” operation is a retiming operation that automatically divides a selected range of a clip or a set of clips in a media clip of a timeline into multiple sections of increasing or decreasing playback speed. For some embodiments,
The second stage 502 illustrates the selection of a ramp retiming operation from an effects menu 540. Specifically, the selection specifies that the ramp operation gradually decreases the speed of the selected range 530 toward 0% of normal playback speed. The effects menu 540 also includes other options for the ramp retiming operation. For example, the user can select to gradually increase playback speed toward 200% of normal playback speed.
The third stage 503 illustrates the result of the ramp retiming operation. The effects bar 520 and the media clip 510 have been partitioned into seven different sections 521-527. Sections 521 and 527 correspond to portions of the media clip 510 that fall outside of the range 530 and thus remain at 100% of normal playback speed. Section 527 starts at a playback time t′1 instead of t1 because the selected ramp retiming operation slows down the playback speed and increases the playback time. Sections 522-526 are assigned playback speeds of 87%, 62%, 38%, 13%, and 0%, respectively. To complete the speed ramp toward 0%, some embodiments include the 0% playback speed section 526. In some of these embodiments, the 0% playback speed portion of the speed ramp is shorter than the other sections (522-525) in the ramp.
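One plausible way to derive such section speeds, offered only as an illustration and not necessarily the application's exact scheme, is to assign each equal-length section the midpoint speed of a linear ramp:

```python
# A sketch: divide the ramp into equal sections and give each section the
# midpoint speed of a linear progression from start_speed to end_speed.

def ramp_speeds(start_speed, end_speed, sections):
    step = (end_speed - start_speed) / sections
    return [start_speed + step * (i + 0.5) for i in range(sections)]

# Ramping 100% toward 0% over four sections gives values close to the
# 87%, 62%, 38%, and 13% described above (a 0% section may be appended).
print(ramp_speeds(100, 0, 4))  # [87.5, 62.5, 37.5, 12.5]
```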
One of ordinary skill in the art would recognize that many implementations of the ramp retiming operation are possible other than the one illustrated in
Some embodiments of the media editing application include speed effects that cause playback to pause or hold at a particular video frame.
The timeline includes a playhead 725 for indicating which position in the timeline is currently being displayed in the preview display area 720. The timeline 730 also includes a media clip 750 containing video and audio clips. The media clip 750 includes an effects bar 760 that indicates the playback speed of the media clip 750. Unlike the media clip 150 of
The thumbnails are miniaturized versions of video frames sampled at regular intervals of the playback time from the media content in the media clip 750. The thumbnails display a sequence of images that correspond to the playback speed of the media clip 750. For purposes of illustration, each thumbnail is marked with an index corresponding to the position of the thumbnail image in media time. For example, a thumbnail image sampled at media time 1 has an index of 1, and a thumbnail image sampled at media time 2.6 has an index of 2.6. One of ordinary skill in the art would realize that such indices are for the purpose of illustration only, and that some embodiments do not display such indices in the thumbnail images. In the example of
At the first stage 701, the entirety of the media clip 750 is at normal playback speed (100%) as indicated by the effects bar 760. The preview display area 720 is displaying a video image with an index of 1, since the playhead 725 is at a position near the start of the thumbnail 751 with index 1. The first stage 701 also shows the cursor 790 placed over the retiming activation item 724 for activating retiming operations.
The second stage 702 shows the selection of a range 770 after the activation of the retiming operations (indicated by the highlighting of the item 724). The second stage 702 also illustrates a selection of a “hold” speed effect preset from an effects menu 740. As the range 770 extends from playback time t0 to playback time t1, the hold operation causes the playback to pause/hold at a video frame from t0 until t1.
The third stage 703 shows the result of the “hold” operation. A new section 762 appears in the effects bar 760 starting from playback time t0 to playback time t1. The section 762 indicates a playback speed of 0% of normal due to the hold retiming operation. The new thumbnail images 756-758 are sampled between the playback times t0 and t1. Since the playback is on “hold” and thus frozen between t0 and t1, the thumbnail images sampled during this period do not change (they remain at index 2.6). Thumbnail images sampled after the section 762 once again progress according to a playback speed of 100%. The third stage 703 also shows the cursor 790 placed over the playback activation item 722 in order to start playback of the media content in the timeline. The preview display area 720 displays a video frame near the start of the first thumbnail (an image with the index 1).
The fourth, fifth, sixth and seventh stages 704-707 show the playback of the composite presentation in the timeline 730 after the hold retiming operation. During stages 704-707, the playback activation item is highlighted, indicating that the playback of the composite presentation is in progress.
At the fourth stage 704, the playhead has moved onto a position immediately after the second thumbnail 752, and the preview display area 720 accordingly displays a video image in the corresponding thumbnail. At the fifth stage 705 and at the sixth stage 706, the playhead 725 traverses a portion of the central compositing lane that has 0% playback speed, and the preview display area 720 is frozen at a video image that is similar to the thumbnail images 756-758 (all have index 2.6).
At the seventh and the final stage 707, the playhead has moved out of the 0% playback speed portion and once again progresses at 100% speed. The preview display area 720 accordingly displays a new video image (index 3).
In some embodiments, the media editing application includes preset operations that repeat a section of a clip or a media clip.
At the first stage 901, the entirety of the media clip 950 is at normal playback speed (100%) as indicated by the effects bar 960. The media clip 950 displays five thumbnail images 951-955, indexed as 1, 2, 3, 4, and 5 at the current zoom level. The preview display area 920 is displaying a video image with an index of 1, since the playhead 925 is at a position near the start of the thumbnail 951 with index 1. The first stage 901 also shows the cursor 990 placed over the retiming activation item 924 for activating retiming operations.
The second stage 902 shows the selection of a range 970 after the activation of the retiming operations (indicated by the highlighting of the item 924). The second stage 902 also illustrates a selection of an “instant replay” preset from an effects menu 940 at a speed of 100%. As the range 970 extends from playback time t0 to playback time t1, the instant replay operation causes the playback to repeat the media content contained within t0 to t1. One of ordinary skill in the art would recognize that the repeated media content can be at any playback speed. For example, the user can select to perform instant replay at 50% normal speed or at 200% normal speed.
The third stage 903 illustrates the result of the instant replay retiming operation. As illustrated, the media content from t0 to t1 is repeated at t1. The media editing application accordingly has created a new section 962 in the effects bar 960 starting from t1 and ending at t2, corresponding to the repeated media content. The thumbnails in the media clip 950 also reflect the repetition of media content due to the instant replay operation (i.e., the thumbnails repeat indices 2 and 3 within the section 962). The third stage 903 also shows the cursor 990 placed over the playback activation item 922 in order to start playback of the media content in the timeline. The preview display area 920 displays a video frame near the start of the first thumbnail (an image with the index 1).
The fourth, fifth, sixth, seventh, and eighth stages 904-908 show the playback of the composite presentation in the timeline 930 after the instant replay retiming operation. During stages 904-908, the playback activation item 922 is highlighted, indicating that the playback of the composite presentation is in progress.
At the fourth stage 904, the playhead 925 has moved into a position immediately after the start of the second thumbnail 952, and the preview display area 920 accordingly displays a video image similar to the thumbnail 952 (index 2). Likewise, at the fifth stage 905, the playhead 925 has moved into a position immediately after the start of the third thumbnail 953, and the preview area accordingly displays a video image similar to the thumbnail 953 (index 3).
The sixth and the seventh stages show the playhead traversing the portion of the central compositing lane that has the repeated media content. At the sixth stage 906, the playhead is once again at the start of a thumbnail with index 2 (thumbnail 956), causing the preview display area to display a video image similar to the image in the thumbnail 956. At the seventh stage 907, the playhead is once again at the start of a thumbnail with index 3 (thumbnail 957), causing the preview display area to display a video image similar to the image in the thumbnail 957.
At the eighth and the final stage 908, the playhead has moved out of the repeated section 962 and into a section 963. The preview display area 920 accordingly displays a new video image (index 4).
At the first stage 1101, the entirety of the media clip 1150 is at normal playback speed (100%) as indicated by the effects bar 1160. The media clip 1150 displays five thumbnail images 1151-1155, indexed as 1, 2, 3, 4, and 5. The preview display area 1120 is displaying a video image with an index of 1, since the playhead 1125 is at a position near the start of the thumbnail 1151 with index 1. The first stage 1101 also shows the cursor 1190 placed over the retiming activation item 1124 for activating retiming operations.
The second stage 1102 shows the selection of a range 1170 after the activation of the retiming operations (indicated by the highlighting of the item 1124). The second stage 1102 also illustrates a selection of a “rewind” preset from an effects menu 1140 at a speed of ×1 (i.e., 100%). As the range 1170 extends from playback time t0 to playback time t1, the rewind operation causes the playback to repeat the section between t0 and t1 in a reverse direction starting immediately after t1. One of ordinary skill in the art would recognize that the selected section can be played in reverse at any speed. For example, the user can select to rewind at 50% normal speed or at 200% normal speed.
The third stage 1103 illustrates the result of the rewind retiming operation. As illustrated, the media content from t0 to t1 is repeated at t1 in the reverse direction, and then repeated at t2 in the forward direction (thus giving the visual appearance of video rewind from t1 to t0). The media editing application accordingly has created a new section 1162 in the effects bar 1160 starting from t1 and ending at t2, corresponding to the repeated media content. The media content in the section 1162 performs playback in the reverse direction (−100% of normal speed). The thumbnails in the media clip 1150 also reflect the reverse repetition of media content due to the rewind operation (i.e., the newly appearing thumbnails 1156 and 1157 run in the reverse direction from index 4 to index 2 in the section 1162). The third stage 1103 also shows the cursor 1190 placed over the playback activation item 1122 in order to start playback of the media content in the timeline. The preview display area 1120 displays a video frame near the start of the first thumbnail (an image with the index 1).
The fourth, fifth, sixth, seventh, and eighth stages 1104-1108 show the playback of the composite presentation in the timeline 1130 after the rewind retiming operation. During stages 1104-1108, the playback activation item 1122 is highlighted, indicating that the playback of the composite presentation is in progress.
At the fourth stage 1104, the playhead 1125 has moved into a position immediately after the second thumbnail 1152, and the preview display area 1120 accordingly displays a video image similar to the thumbnail 1153 (index 3). Likewise, at the fifth stage 1105, the playhead 1125 has moved into a position immediately after the start of the thumbnail 1154, and the preview area accordingly displays a video image similar to the thumbnail 1154 (index 4).
The sixth and the seventh stages show the playhead traversing the portion of the central compositing lane that has the reversed media content. At the sixth stage 1106, the playhead is at the start of another thumbnail with index 3 (thumbnail 1156), causing the preview display area to display a video image similar to the image displayed at the fourth stage 1104. At the seventh stage 1107, the playhead 1125 has traversed all the way to the end of the reversed media content. The preview display area now displays a video image similar to the thumbnail 1158 with index 2.
After traversing the reverse playback section 1162, the playback resumes in the forward direction. The eighth and final stage 1108 shows the resumption of the forward playback. The playhead 1125 is immediately after the start of a thumbnail with index 3.
In some embodiments, the video playback of a composite presentation being composited by the media editing application is conducted at a particular frame rate. However, the source media content (e.g., source video clips in the media library) that is used to construct the composite presentation may not have the same frame rate. In such cases, some embodiments construct interpolated frames in order to convert frames from a native frame rate to the particular frame rate of the composite presentation. The interpolation of video frames will be discussed further below in Section III.
In some embodiments, the media editing application provides a retiming speed effect preset that plays every frame of a video clip at a rate that conforms with the particular frame rate of the media editing application. For example, a media editing application in some embodiments plays back at a standard frame rate of 24 frames per second, while a piece of high resolution media content produced by a high speed camera may have 60 frames per second. Playing such a high resolution piece of media at 100% normal speed requires down-sampling of the frames (e.g., playing back only two frames for every five available). Some embodiments provide a “conform speed” preset that plays every frame of the piece of high resolution media within a selected range at the standard 24 frames per second. The result is a section that plays every frame of the high resolution media content, albeit at a slower rate of 40% of normal speed (i.e., 24/60).
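The underlying arithmetic is simply the ratio of the two frame rates, as this small sketch shows:

```python
# "Conform speed": play every source frame at the project frame rate; the
# effective playback speed is the ratio of the two rates.

def conform_speed(project_fps, source_fps):
    """Effective playback speed (1.0 = normal) when every source frame
    is shown at the project frame rate."""
    return project_fps / source_fps

assert conform_speed(24, 60) == 0.4     # 60 fps footage plays at 40%
print(f"{conform_speed(24, 72):.1%}")   # 72 fps footage plays at 33.3%
```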
One of ordinary skill in the art would recognize that the “conform speed” retiming operation is applicable regardless of the source frame rates and the playback frame rates. For some embodiments,
The second stage shows the selection of the “conform speed” retiming operation preset from an effects menu 1340. The “conform speed” retiming operation will be applied to the media content in the selected range 1370 between the playback times t0 and t1.
The third and final stage illustrates the result of the retiming operation. A new section 1362 has appeared in the effects bar 1360 that corresponds to the selected range 1370. The new section 1362 ends at t′1 instead of t1 because its duration is three times as long as that of the selected range 1370. This section corresponds to a portion of media content that is to be played back at 33.3% of normal speed because every frame of the source media content is being played back. Since the source frame rate of the media content is three times the playback frame rate, the playback speed is effectively reduced to 33.3% of normal. The thumbnail images under the section 1362 reflect the reduced playback speed, as they are thumbnails indexed at increments of 1 (6, 7, 8, 9, 10, 11 . . . ).
The section of the curve 1400 after t0 and before t′1 corresponds to the section with the “conform speed” retiming speed effect. In this section of the curve, every frame of the media content is played, but media time elapses at only ⅓ of its previous rate, yielding a 33.3% playback speed.
II. Speed Effects Manipulations
As mentioned above, the media editing application of some embodiments provides preset speed effects so a user can quickly achieve an initial retiming effect on a selected range in the composite presentation. The initial retiming effect can then be used as a template for further adjustments and refinements by the user for a desired result.
The media clip 1550 also displays an effects bar 1560 that indicates the playback speed of the media content in the container. The effects bar is partitioned into three sections 1561-1563 by one or more previous preset speed effect operations, such as the “slower” operation discussed earlier by reference to
The second stage 1502 in
In some embodiments, the graphical expansion of a speed effect section is accompanied by graphical stretching of thumbnail images in that section. As illustrated, the thumbnails 1551-1553 have been graphically stretched along with the section 1563.
The third stage 1503 in
The fourth stage 1504 in
The fifth stage 1505 in
The sixth and final stage 1506 in
In addition to adjusting the playback speed of individual sections of the effects bar of a media clip, a user can also adjust the range of individual sections partitioned by retiming presets.
Within the timeline 1600 is a media clip (or media container) 1650. The media clip displays a series of thumbnail images sampled at regular intervals of the playback time from the media clip 1650.
The media clip 1650 also displays an effects bar 1660 that indicates the playback speed of the media content in the container. The effects bar is partitioned into three sections 1661-1663 by one or more previous preset speed effect operations, such as the “slower” or “faster” retiming operations discussed earlier by reference to
The second stage 1602 shows the opening of the contextual menu 1640 and the selection of a command specific to the section 1661. The contextual menu 1640 is opened as a result of the selection of the UI item 1682. As illustrated, the contextual menu includes commands such as “slow”, “fast”, “normal” and “change end” that are specific to the section 1661. The command “slow” slows the playback speed of the section 1661. The command “fast” accelerates the playback speed of the section 1661. The command “normal” resets the playback speed of the section 1661 to 100% of normal speed. One of ordinary skill in the art would recognize that these commands are similar to the preset “slower” and “faster” operations as illustrated above by reference to
The “change end” command is also a contextual command applicable only to the section 1661. It changes the position of the end of the section 1661 and the start of the section 1662 in media time. In other words, the “change end” command shifts the border between the sections 1661 and 1662 such that some media content that was once in section 1662 becomes part of section 1661 (or vice versa).
The third stage 1603 shows the appearance of a “change end” tool UI item 1695 after the invocation of the “change end” command. The “change end” tool UI item 1695 is situated at the border (at t1) between the sections 1661 and 1662 so the user can manipulate the UI item 1695 to shift the border between the two sections. In some embodiments, the preview display area displays the video image that is being pointed to by the UI item 1695 in order to facilitate the precise placement of the border. In some embodiments, such a preview display area is similar to the preview display area 720 of GUI 700 in
The fourth and final stage 1604 shows the result of the range adjustment operation by manipulation of the “change end” tool UI item 1695. As illustrated, the user has used the cursor 1690 to drag the UI item 1695 from t1 to t′1. However, since the section 1661 has a different playback speed (100%) than the section 1662 (50%), the border shift made by the “change end” operation causes all media content after t′1 to shift. Consequently, the keyframe 1611 has shifted from t1 to t′1 by extending the 100% speed section (section 1661), which causes the 50% section of the playback curve 1610 to start later at t′1. The end of the 50% section (section 1662) must still end at the same media time as before (i.e., 5), which occurs at an earlier playback time t′2, causing the 150% section (section 1663) and the keyframe 1613 to shift earlier in playback time.
III. Mapping of Playback Curves
A. Interpolation
The playback curve as discussed above in Sections I and II maps an instant in playback time to an instant in media time. In some embodiments, the mapped media time is then used to fetch a frame from the source media clip. However, not all media time instants mapped by the playback curve have a video frame that is specified to be displayed at that precise moment. For example, a playback curve may specify a media time that is temporally halfway between two video frames in the original media clip. In these instances, it can be unclear which video frame the media editing application should fetch for display. In such cases, some embodiments produce an interpolated frame based on other frames in the video clip that are in the vicinity of the media time.
For some embodiments,
As illustrated, the playback curve 1700 maps the playback time instant P1 (at playback time 18 or playback video frame 18) to the media time instant M1 (media time 21), which precisely or very nearly maps to frame count 14 on the frame count scale 1710. Not all playback video frames (or integer playback times) map precisely onto an actual video frame in the source media clip. For example, the playback time instant P2 (at playback time 15) maps to media time instant M2 (media time 10) and then to a position 6.7 on the frame count scale 1710 that is positioned between source video frames 6 and 7. Likewise, the playback time instant P3 (at playback time 7) maps to media time instant M3 (media time 6.3), which maps to a position 4.2 on the frame count scale 1710 that is positioned between the source video frames 4 and 5.
In some embodiments, a playback time instant that maps to a media time instant sufficiently close to an actual video frame (on the frame count scale) does not require an interpolated frame. In some of these embodiments, the difference in media time between the position of the actual video frame and the mapped playback position is compared against a threshold. If the difference is within such a threshold, interpolation will not be performed and the actual video frame (the nearest frame to the mapped playback position) is fetched directly for playback.
For a playback time instant that does not map to a media time instant sufficiently close to an actual video frame, some embodiments generate an interpolated frame. In some other embodiments, interpolation is always performed, even if a playback time instant maps exactly on to an actual video frame in the source media content.
Some embodiments perform interpolation by using the nearest neighboring video frame in the source media content as the interpolated frame. For example, for playback frame 7 (P3), which maps onto position 4.2 in the frame count scale (M3), the actual frame 4 in the source media content will be used as the interpolated frame and displayed during playback.
Some embodiments perform interpolation by blending video frames. In some of these embodiments, frames neighboring the mapped playback position are blended together to produce the interpolated frame. In some embodiments, frame blending is performed by applying a weighting factor to each of the blended source frames according to the temporal distance between the blended source frame and the mapped playback position. In the example of
Some embodiments perform interpolation by optical flow. Optical flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer (an eye or a camera) and the scene. Sequences of ordered images allow the estimation of motion as either instantaneous image velocities or discrete image displacements. Some embodiments create the interpolated frame by estimating the motion of pixels using ordered images of frames neighboring the mapped playback position.
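The following sketch illustrates the first two strategies, nearest-neighbor fetching under an alignment threshold and distance-weighted blending; the threshold value, names, and one-number stand-ins for frames are assumptions:

```python
import numpy as np

NEAR_FRAME_THRESHOLD = 0.05  # in frames; an assumed tolerance

def fetch_frame(frames, position):
    """frames: sequence of source frames; position: mapped frame count
    (e.g., 6.7 falls between source frames 6 and 7)."""
    lo = int(np.floor(position))
    w = position - lo                     # temporal distance past frame lo
    if w < NEAR_FRAME_THRESHOLD:          # close enough: use actual frame
        return frames[lo]
    if w > 1 - NEAR_FRAME_THRESHOLD:
        return frames[lo + 1]
    # Blend the two neighbors; the nearer frame gets the larger weight.
    return (1 - w) * frames[lo] + w * frames[lo + 1]

frames = np.arange(10, dtype=float)      # scalar stand-ins for video frames
# Position 6.7: frame 6 weighted 30%, frame 7 weighted 70%.
assert np.isclose(fetch_frame(frames, 6.7), 0.3 * 6 + 0.7 * 7)
assert fetch_frame(frames, 4.01) == frames[4]  # near-aligned: actual frame
```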
For some embodiments,
The process 1800 next uses (at 1830) the playback curve to look up a media time that corresponds to the received playback time. The process determines (at 1835) whether the media time is sufficiently aligned with an actual frame. If the media time being looked up is sufficiently aligned with an actual frame, the process 1800 proceeds to 1860 to retrieve the actual frame for display. Otherwise, the process proceeds to 1840. In some embodiments, the process always proceeds to 1840 and creates an interpolated frame regardless of whether the media time is sufficiently aligned with an actual frame.
At 1840, the process creates an interpolated frame based on the media time. The mapping (or look-up) of the media time and the creation of the interpolated frame are described above by reference to
At 1860, the process retrieves an actual frame based on the mapped media time. The process 1800 then displays (at 1870) or delivers the retrieved frame. After displaying or delivering the retrieved frame, the process 1800 ends.
B. Curve Smoothing
As mentioned earlier by reference to stage 206 of
In some embodiments, the curve smoothing operation is a spline interpolation operation based on keyframes. Some embodiments plug the coordinates of the keyframes as a data set into standard mathematical expressions for drawing a smoothed curve. In order to prevent the interpolated curve from overshooting and resulting in unintended fluctuations in playback speed, some embodiments use the Monotone Cubic Interpolation technique for performing the curve smoothing operation. Monotone Cubic Interpolation is a variant of cubic interpolation that preserves the monotonicity of the data set being interpolated. See, e.g., pages 238-246 of Fritsch, F. N.; Carlson, R. E. (1980) “Monotone Piecewise Cubic Interpolation”; SIAM Journal on Numerical Analysis (SIAM) 17 (2). Some embodiments specifically set certain parameters of the spline interpolation to prevent overshoot and ensure monotonicity. These parameters cannot be directly accessed or adjusted by the user in some embodiments.
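As an illustration (standing in for, not reproducing, the application's own implementation), SciPy's PCHIP interpolator implements the Fritsch-Carlson monotone scheme cited above and can smooth a set of keyframes without overshoot:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Keyframes: (playback time, media time). A naive cubic spline through
# these points can overshoot; PCHIP preserves the monotonicity of the
# data, so the smoothed curve never runs the media backwards (no
# playback-speed overshoot) between keyframes.
playback = np.array([0.0, 2.0, 4.0, 8.0, 12.0])
media = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
curve = PchipInterpolator(playback, media)

t = np.linspace(0.0, 12.0, 121)
assert np.all(np.diff(curve(t)) >= 0)  # monotone everywhere
```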
C. Audio Mapping
In some embodiments, each speed effect or retiming operation performed on video content is accompanied by a corresponding speed effect or retiming operation on audio content. A preset retiming operation performed on a selected section of a media clip applies to both the video and the audio of the selected section of the media clip. In some embodiments that use the same playback curve for both audio and video, every alteration or adjustment to the playback curve (such as retiming or curve smoothing) applies to both video and audio in order to keep audio and video in sync. Each playback time instant is mapped to a media time instant using the playback curve for both video and audio. The slope of the playback curve at each playback time instant is used to determine the instantaneous audio playback speed.
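Continuing the PCHIP sketch above, the derivative of the smoothed curve gives the instantaneous audio playback speed at any playback time:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

playback = np.array([0.0, 2.0, 4.0, 8.0, 12.0])
media = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
curve = PchipInterpolator(playback, media)

speed = curve.derivative()  # slope of the playback curve
print(speed(1.0))           # ~1.0: normal-speed audio in the 100% section
print(speed(10.0))          # ~0.5: half-speed audio in the slowed section
```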
In some embodiments, a retiming operation affects the playback speed of audio but not the pitch of the audio. Some embodiments use common audio pitch preservation techniques to ensure that changes in playback speed do not affect the pitch of the audio during playback.
IV. Speed Effects and Anchor Clips
Some embodiments of the media editing application support anchored clips that are anchored to a particular video frame during playback. An anchored clip is a clip that is set to be displayed (if video) or played (if audio) starting at the playback time of that particular video frame. As retiming operations such as those discussed above in Sections I through III can change the timing of any particular video frame, some embodiments ensure that the anchored clip remains anchored to the correct video frame after the retiming operations. Some of these embodiments map the anchor frame to an anchor media time TA, and then use the anchor media time TA to map to the correct new playback time after the retiming operation. Some embodiments reverse map the anchor media time TA to a unique anchor playback time for the anchored clip by marking a section of the playback curve defined by two keyframes as being associated with the anchored clip.
In addition to the media clip 1950, the timeline 1900 includes an anchored clip 1970 that is attached to an anchor frame (frame A). The anchored clip 1970 is set to play when frame A is being displayed. The anchored clip can be a cut-away clip that will be displayed instead of the video in the central compositing lane 1950 following frame A, or a graphic effect that is to be overlaid on top of the video in the central compositing lane 1950.
The playback curve 1910 includes two keyframes 1911 and 1912 at playback time t0 and playback time t1 that correspond to the selected range 1980. The anchor frame (frame A) is at playback time t2, which maps to an anchor media time TA according to the playback curve 1910.
The second stage 1902 shows the effect of the "faster" retiming operation on the anchored clip 1970. The "faster" retiming operation has caused a new 200% playback speed section 1962 to appear in the effects bar 1960 that corresponds to the selected range 1980. The compression of the media content in the selected range 1980 also causes the media content after playback time t0 to shift, as illustrated by the shifting of the keyframe 1912 from t1 to t′1. The retiming operation has also shifted the anchored clip 1970 from t2 to a new anchor playback time t′2. Some embodiments determine this new playback time t′2 for the anchored clip by mapping the anchor media time TA back to the playback time axis using the retimed playback curve 1910 in the second stage 1902.
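The keyframe arithmetic in this stage can be reproduced with simple bookkeeping: playback times inside the selected range are rescaled toward t0 by the speed factor, and everything after the range shifts by the time saved. Below is a hypothetical Python sketch of such an adjustment (a simplified keyframe list, not the application's actual data model):

```python
def apply_speed_preset(keyframes, t0, t1, speed):
    """Retime the playback range [t0, t1] to `speed` (2.0 = the 200%
    'faster' preset). Keyframes are (playback_time, media_time) pairs
    sorted by playback time; media times are left untouched."""
    saved = (t1 - t0) * (1.0 - 1.0 / speed)   # playback time gained (or lost)
    out = []
    for t, mt in keyframes:
        if t <= t0:
            out.append((t, mt))                      # before the range: unchanged
        elif t <= t1:
            out.append((t0 + (t - t0) / speed, mt))  # inside: rescaled toward t0
        else:
            out.append((t - saved, mt))              # after: shifted earlier
    return out
```

With speed = 2.0, a keyframe at t1 lands at t0 + (t1 − t0)/2, matching the shift from t1 to t′1 described above.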
In some embodiments, a retiming operation on the central compositing lane affects only the playback speed of the media content in the central compositing lane and not the playback speed of the anchored clip. In the example above, the "faster" retiming operation changes when the anchored clip 1970 starts to play (from t2 to t′2), but the anchored clip itself still plays at its original speed.
As mentioned, some embodiments use the anchor media time TA for determining a new playback time for the anchored clip by mapping TA back to a new anchor playback time using the playback curve. However, some retiming operations repeat a portion of the media clip (such as instant replay or rewind), so the playback curve cannot guarantee a one-to-one mapping between media time and playback time. In these instances, the anchor media time TA may map to multiple anchor playback times. In order to determine which of the multiple anchor playback times is the one intended by the user, some embodiments keep track of user interaction with the anchored clip in order to determine which section of the media clip the anchored clip is supposed to be anchored to. Some embodiments accomplish this by marking a section of the playback curve defined by two keyframes as being associated with the anchored clip.
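One way to realize this bookkeeping is to collect every playback time at which the curve passes through TA and keep only the candidate that falls inside the marked section. A sketch under simplifying assumptions (the curve is treated as linear between keyframes; names are illustrative):

```python
def reverse_map_anchor(keyframes, anchor_mt, marked_span):
    """Map an anchor media time back to a unique playback time.
    `keyframes` are (playback_time, media_time) pairs; `marked_span` is
    the (start, end) playback-time section associated with the anchored
    clip. Segments are treated as linear between keyframes."""
    lo, hi = marked_span
    for (t_a, m_a), (t_b, m_b) in zip(keyframes, keyframes[1:]):
        if m_a == m_b or not (min(m_a, m_b) <= anchor_mt <= max(m_a, m_b)):
            continue                          # media time not reached in this segment
        # Linear interpolation; also handles reversed (rewind) segments.
        t = t_a + (anchor_mt - m_a) / (m_b - m_a) * (t_b - t_a)
        if lo <= t <= hi:
            return t                          # the candidate inside the marked section
    return None                               # anchor media time never reached there
```

In the rewind example below, TA maps to three candidates; the section marked between the keyframes 2011 and 2012 selects t2.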
The playback curve 2010 includes two keyframes 2011 and 2012 at playback time t0 and playback time t1 that correspond to the selected range 2080. The anchor frame (frame A) is at playback time t2, which maps to an anchor media time TA according to the playback curve 2010. The insertion of the keyframes 2011 and 2012 places the anchor for the anchored clip 2070 in a section of the playback curve defined by the keyframes 2011 and 2012. The media editing application thus marks the section defined by these two keyframes as being associated with the anchored clip 2070 for future reverse mapping of the anchor playback time.
The second stage 2002 shows the result of the rewind operation. As illustrated, the rewind operation has created a new section 2062 at −100% playback speed (rewind at ×1 speed). The rewind operation has also caused the media content between t0 and t1 to play three times: forward from t0 to t1, reverse from t1 to t3, and forward again from t3 to t4. The playback curve 2010 similarly shows the repetition of media content: forward from the keyframe 2011 at t0 to the keyframe 2012 at t1, reverse from the keyframe 2012 at t1 to a new keyframe 2013 at t3, and forward from the keyframe 2013 to playback time t4. As a result of the rewind operation, the anchor media time TA thus maps to three playback times: t2, t′2, and t″2.
Since the last user interaction with the anchored clip anchors it to the section of the playback curve 2010 between the keyframes 2011 and 2012, some embodiments use this information to uniquely map the anchor media time TA to the anchor playback time t2, which lies between the keyframes 2011 and 2012.
The third stage 2003 shows a user action that moves the anchored clip 2070 from t2 to a new playback time t6. The new playback time is mapped to a new anchor media time T′A. This user action now becomes the last user action associated with the anchored clip 2070, and the section between the keyframe 2013 and the keyframe 2014 (at the end of the playback curve at t5) is marked for future mapping of the new anchor media time T′A.
The fourth stage 2004 shows the selection of another range 2082 and the invocation of another retiming operation to be applied to the selected range. The range 2082 starts at the beginning of the playback curve at playback time 0 and ends at t1. An effects menu 2042 is used to select a "faster" retiming preset that changes the media content within the range 2082 to play back at 200% of the normal speed.
The final stage 2005 shows the result of the "faster" retiming operation and the new playback time for the anchored clip 2070. The effects bar section 2061 corresponds to the selected range 2082 and indicates a playback speed of 200%. The media content subsequent to the selected range has shifted due to the increase in playback speed in the section 2061. After the "faster" retiming operation, the anchor media time T′A maps to three different playback times t7, t′7, and t″7. Since t7 is the only playback time mapped by the playback curve to fall between the keyframes 2013 and 2014, the media editing application determines that t7 is the unique anchor playback time for the anchored clip 2070.
V. Software Architecture
In some embodiments, the processes described above are implemented as software running on a particular machine, such as a computer or a handheld device, or stored in a computer readable medium.
The media editing application 2100 includes a user interface (UI) interaction module 2105, a retiming module 2130, and a playback control module 2120. The media editing application 2100 also includes playback curve storage 2145, video source storage 2155, and audio source storage 2165. In some embodiments, storages 2145, 2155, and 2165 are all stored in one physical storage 2190. In other embodiments, the storages are in separate physical storages, or two of the storages are in one physical storage while the third is in a different physical storage. For instance, the video source storage 2155 and the audio source storage 2165 will often not be separated into different physical storages.
The input device drivers 2172 may include drivers for translating signals from a keyboard, mouse, touchpad, drawing tablet, touchscreen, etc. A user interacts with one or more of these input devices, which send signals to their corresponding device driver. The device driver then translates the signals into user input data that is provided to the UI interaction module 2105.
The media editing application 2100 of some embodiments includes a graphical user interface that provides users with numerous ways to perform different sets of operations and functionalities. In some embodiments, these operations and functionalities are performed based on different commands that are received from users through different input devices (e.g., keyboard, trackpad, touchpad, mouse, etc.). For example, the present application illustrates the use of a cursor in the graphical user interface to control (e.g., select, move) objects in the graphical user interface. However, in some embodiments, objects in the graphical user interface can also be controlled or manipulated through other controls, such as touch control. In some embodiments, touch control is implemented through an input device that can detect the presence and location of touch on a display of the input device. An example of a device with such functionality is a touch screen device (e.g., as incorporated into a smart phone, a tablet computer, etc.). In some embodiments with touch control, a user directly manipulates objects by interacting with the graphical user interface that is displayed on the display of the touch screen device. For instance, a user can select a particular object in the graphical user interface by simply touching that particular object on the display of the touch screen device. As such, when touch control is utilized, a cursor may not even be provided for enabling selection of an object of a graphical user interface in some embodiments. However, when a cursor is provided in a graphical user interface, touch control can be used to control the cursor in some embodiments.
The display module 2180 translates the output of a user interface for a display device. That is, the display module 2180 receives signals (e.g., from the UI interaction module 2105) describing what should be displayed and translates these signals into pixel information that is sent to the display device. The display device may be an LCD, plasma screen, CRT monitor, touchscreen, etc. In some embodiments, the display module 2180 also receives signals from the playback control module 2120 for displaying video images from a composite presentation that the media editing application is composing.
The audio module 2185 translates the output of a user interface for a sound producing device that converts digital audio signals into actual sounds. In some embodiments, the audio module 2185 also receives digital audio signals from the playback control module for playing sound produced from a composite presentation that the media editing application is composing.
The network connection interface 2174 enables the device on which the media editing application 2100 operates to communicate with other devices (e.g., a storage device located elsewhere in the network that stores the raw audio data) through one or more networks. The networks may include wireless voice and data networks such as GSM and UMTS, 802.11 networks, wired networks such as Ethernet connections, etc.
The UI interaction module 2105 of media editing application 2100 interprets the user input data received from the input device drivers 2172 and passes it to various modules, including the retiming module 2130 and the playback control module 2120. The UI interaction module also manages the display of the UI, and outputs this display information to the display module 2180. This UI display information may be based on information from the playback control module 2120 or directly from the video source storage 2155 and audio source storage 2165. In some embodiments, the UI interaction module 2105 includes a range selector module 2115 for processing user selection of a range in a media clip for retiming operation.
The playback curve storage 2145 receives and stores playback curves generated and adjusted by the retiming module 2130. A stored playback curve can be accessed for further adjustment by the retiming module, or accessed and used by the playback control module 2120 to retrieve video and audio data from the video source storage 2155 and the audio source storage 2165. The video source storage 2155 receives and stores video data from the UI interaction module 2105 or an operating system 2170. The audio source storage 2165 likewise receives and stores audio data from the UI interaction module and the operating system 2170.
The retiming module (or retiming engine) 2130 generates and adjusts playback curves. In some embodiments, the retiming module generates a new playback curve and stores it in the playback curve storage 2145 whenever a new media clip is created. The retiming module also receives retiming commands and associated parameters from the UI interaction module 2105. The retiming module 2130 uses the received retiming commands to insert keyframes and adjust the playback curve. In addition to adjusting the curve according to retiming, the retiming module 2130 also performs curve smoothing operations on the playback curve.
The playback control module 2120 retrieves images from the video source storage 2155 and produces video frames for display at the display module 2180. The playback control module fetches the images based on the playback curve stored in the playback curve storage 2145 and produces interpolated frames for the display module 2180. The playback control module 2120 also produces audio for the audio module 2185 in the operating system 2170 based on audio data retrieved from the audio source storage 2165 and the playback curve.
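The fetch-and-interpolate loop described for the playback control module can be sketched as follows, reusing the eval_hermite helper from the curve smoothing example above; fetch_frame and the linear blend are hypothetical stand-ins for the image fetcher and frame interpolator:

```python
def render_frame(playback_t, x, y, m, fetch_frame, fps):
    """Produce the display frame for one playback instant: map playback
    time to media time through the playback curve, fetch the two source
    frames that bracket that media time, and blend them."""
    media_t = float(eval_hermite(x, y, m, playback_t))  # playback -> media time
    pos = media_t * fps                                 # fractional source frame index
    i, frac = int(pos), pos - int(pos)
    a, b = fetch_frame(i), fetch_frame(i + 1)           # bracketing source frames
    return a * (1.0 - frac) + b * frac                  # simple linear blend
```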
While many of the features have been described as being performed by one module (e.g., the playback control module 2120 or the retiming module 2130), one of ordinary skill in the art will recognize that the functions described herein might be split up into multiple modules. Similarly, functions described as being performed by multiple different modules might be performed by a single module in some embodiments. For example, the retiming module 2130 can be implemented using sub-modules such as a playback curve generator 2132, a playback curve adjuster 2134, and a playback curve smoother 2136. The playback curve generator 2132 generates playback curves. The playback curve adjuster 2134 adjusts playback curves according to retiming commands. The playback curve smoother 2136 performs curve smoothing operations on the playback curve. Likewise, the playback control module 2120 can be implemented using sub-modules such as an image fetcher 2122, a frame interpolator 2124, and an audio processing module 2126. The image fetcher 2122 fetches video images from the video source storage 2155 based on the playback time instant and the playback curve in storage 2145. The frame interpolator 2124 creates interpolated frames based on the fetched video images and the playback curve. The audio processing module 2126 likewise uses the playback curve to determine both the playback speed and the playback position (in media time) of the audio.
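A hypothetical sketch of this sub-module decomposition, wiring together the earlier curve sketches (the class and method names mirror the description above and are not actual application code):

```python
import numpy as np

class RetimingModule:
    """Generator / adjuster / smoother split described above."""
    def __init__(self, curve_storage):
        self.curves = curve_storage              # plays the role of storage 2145

    def create_curve(self, clip_id, duration):
        # Playback curve generator: start from an identity mapping.
        self.curves[clip_id] = [(0.0, 0.0), (duration, duration)]

    def retime(self, clip_id, t0, t1, speed):
        # Playback curve adjuster: see apply_speed_preset above.
        self.curves[clip_id] = apply_speed_preset(self.curves[clip_id], t0, t1, speed)

    def smooth(self, clip_id):
        # Playback curve smoother: monotone tangents from the curve smoothing example.
        t, mt = map(np.array, zip(*self.curves[clip_id]))
        return monotone_cubic_tangents(t, mt)
```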
VI. Electronic System
Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer readable storage medium (also referred to as a computer readable medium). When these instructions are executed by one or more computational or processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions. Examples of computer readable media include, but are not limited to, CD-ROMs, flash drives, random access memory (RAM) chips, hard drives, erasable programmable read only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), etc. The computer readable media do not include carrier waves and electronic signals passing wirelessly or over wired connections.
In this specification, the term “software” is meant to include firmware residing in read-only memory or applications stored in magnetic storage which can be read into memory for processing by a processor. Also, in some embodiments, multiple software inventions can be implemented as sub-parts of a larger program while remaining distinct software inventions. In some embodiments, multiple software inventions can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software invention described here is within the scope of the invention. In some embodiments, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
The bus 2205 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of the electronic system 2200. For instance, the bus 2205 communicatively connects the processing unit(s) 2210 with the read-only memory 2230, the GPU 2215, the system memory 2220, and the permanent storage device 2235.
From these various memory units, the processing unit(s) 2210 retrieves instructions to execute and data to process in order to execute the processes of the invention. The processing unit(s) may be a single processor or a multi-core processor in different embodiments. Some instructions are passed to and executed by the GPU 2215. The GPU 2215 can offload various computations or complement the image processing provided by the processing unit(s) 2210. In some embodiments, such functionality can be provided using CoreImage's kernel shading language.
The read-only-memory (ROM) 2230 stores static data and instructions that are needed by the processing unit(s) 2210 and other modules of the electronic system. The permanent storage device 2235, on the other hand, is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when the electronic system 2200 is off. Some embodiments of the invention use a mass-storage device (such as a magnetic or optical disk and its corresponding disk drive) as the permanent storage device 2235.
Other embodiments use a removable storage device (such as a floppy disk, flash memory device, etc., and its corresponding drive) as the permanent storage device. Like the permanent storage device 2235, the system memory 2220 is a read-and-write memory device. However, unlike the storage device 2235, the system memory 2220 is a volatile read-and-write memory, such as a random access memory. The system memory 2220 stores some of the instructions and data that the processor needs at runtime. In some embodiments, the invention's processes are stored in the system memory 2220, the permanent storage device 2235, and/or the read-only memory 2230. For example, the various memory units include instructions for processing multimedia clips in accordance with some embodiments. From these various memory units, the processing unit(s) 2210 retrieves instructions to execute and data to process in order to execute the processes of some embodiments.
The bus 2205 also connects to the input and output devices 2240 and 2245. The input devices 2240 enable the user to communicate information and select commands to the electronic system. The input devices 2240 include alphanumeric keyboards and pointing devices (also called “cursor control devices”), cameras (e.g., webcams), microphones or similar devices for receiving voice commands, etc. The output devices 2245 display images generated by the electronic system or otherwise output data. The output devices 2245 include printers and display devices, such as cathode ray tubes (CRT) or liquid crystal displays (LCD), as well as speakers or similar audio output devices. Some embodiments include devices such as a touchscreen that function as both input and output devices.
Finally, the bus 2205 also couples the electronic system 2200 to a network through a network adapter (not shown). In this manner, the computer can be a part of a network of computers (such as a local area network, a wide area network, or an intranet), or a network of networks (such as the Internet).
Some embodiments include electronic components, such as microprocessors, storage and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra density optical discs, any other optical or magnetic media, and floppy disks. The computer-readable media may store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
While the above discussion primarily refers to microprocessor or multi-core processors that execute software, some embodiments are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself. In addition, some embodiments execute software stored in programmable logic devices (PLDs), ROM, or RAM devices.
As used in this specification and any claims of this application, the terms "computer", "server", "processor", and "memory" all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of this specification, the terms "display" or "displaying" mean displaying on an electronic device. As used in this specification and any claims of this application, the terms "computer readable medium," "computer readable media," and "machine readable medium" are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
While the invention has been described with reference to numerous specific details, one of ordinary skill in the art will recognize that the invention can be embodied in other specific forms without departing from the spirit of the invention. In addition, a number of the figures conceptually illustrate processes whose specific operations may not be performed in the exact order shown and described. Thus, one of ordinary skill in the art would understand that the invention is not to be limited by the foregoing illustrative details.
The present application claims the benefit of U.S. Provisional Patent Application 61/443,692, entitled “Retiming Media Presentations,” filed Feb. 16, 2011. The above-mentioned provisional application is incorporated herein by reference.