Multimedia players are devices that render combinations of video, audio or data content (“multimedia presentations”) for consumption by users. Multimedia players such as DVD players currently do not provide for much, if any, user interactivity during play of video content—video content play is generally interrupted to receive user inputs other than play speed adjustments. For example, a user of a DVD player must generally stop the movie he is playing to return to a menu that includes options allowing him to select and receive features such as audio commentary, actor biographies, or games.
Interactive multimedia players are devices (such devices may include hardware, software, firmware, or any combination thereof) that render combinations of interactive content concurrently with traditional video, audio or data content (“interactive multimedia presentations”). Although any type of device may be an interactive multimedia player, devices such as optical media players (for example, DVD players), computers, and other electronic devices are particularly well positioned to enable the creation of, and consumer demand for, commercially valuable interactive multimedia presentations because they provide access to large amounts of relatively inexpensive, portable data storage.
Interactive content is generally any user-selectable visible or audible object presentable alone or concurrently with other video, audio or data content. One kind of visible object is a graphical object, such as a circle, that may be used to identify and/or follow certain things within video content—people, cars, or buildings that appear in a movie, for example. One kind of audible object is a click sound played to indicate that the user has selected a visible object, such as the circle, using a device such as a remote control or a mouse. Other examples of interactive content include, but are not limited to, menus, captions, and animations.
To enhance investment in interactive multimedia players and interactive multimedia presentations, it is desirable to ensure accurate synchronization of the interactive content component of interactive multimedia presentations with the traditional video, audio or data content components of such presentations. Accurate synchronization generally prioritizes predictable and glitch-free play of the video, audio or data content components. For example, when a circle is presented around a car in a movie, the movie should generally not pause to wait for the circle to be drawn, and the circle should follow the car as it moves.
It will be appreciated that the claimed subject matter is not limited to implementations that solve any or all of the disadvantages of specific interactive multimedia presentation systems or aspects thereof.
In general, an interactive multimedia presentation includes one or more of the following: a predetermined presentation play duration, a video content component, and an interactive content component. The video content component is referred to as a movie for exemplary purposes, but may in fact be video, audio, data, or any combination thereof. The video content component is arranged into a number of frames and/or samples for rendering by a video content manager. The video content manager receives video data (video, audio, or data samples, or combinations thereof) from one or more sources (such as from an optical medium or another source) at a rate that is based on a timing signal, such as a clock signal. The rate of the timing signal may vary based on the play speed of the video content—the movie may be paused, slow-forwarded, fast-forwarded, slow-reversed, or fast-reversed, for example.
The interactive content is arranged for rendering by an interactive content manager at a rate based on a timing signal, such as a continuous clock signal. The interactive content component of the presentation is in the form of one or more applications. An application includes instructions in declarative form or in script form. One type of declarative form includes extensible markup language (“XML”) data structures. The application instructions are provided for organizing, formatting, and synchronizing the presentation of media objects to a user, often concurrently with the video content component.
Methods, systems, apparatuses, and articles of manufacture discussed herein entail using application instructions to specify a time, or a time interval, when a particular media object is renderable. The time, or time interval, is specifiable with reference to either a first timing signal having a rate that is based on the play speed of the interactive multimedia presentation (such as the timing signal used by the video content manager), or with reference to a second timing signal having a continuous predetermined rate (such as the timing signal used by the interactive content manager).
One example of an application instruction usable as described above is a markup element associated with an XML data structure. An example of an XML data structure is a timing document or container or another type of document or container. An attribute of the markup element that is defined by a particular syntax or usage context of the element may be used to specify times based on either the first or second timing signals. An exemplary attribute is the “clock” attribute that is specified by one or more XML schemas for use in applications associated with high-definition DVD movies.
This Summary is provided to introduce a selection of concepts in a simplified form. The concepts are further described in the Detailed Description section. Elements or steps other than those described in this Summary are possible, and no element or step is necessarily required. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended for use as an aid in determining the scope of the claimed subject matter.
Turning to the drawings, where like numerals designate like components,
In operation, Presentation System 100 handles interactive multimedia presentation content (“Presentation Content”) 120. Presentation Content 120 includes a video content component (“video component”) 122 and an interactive content component (“IC component”) 124. Video component 122 and IC component 124 are generally, but need not be, handled as separate data streams by AVC manager 102 and IC manager 104, respectively.
Presentation System 100 also facilitates presentation of Presentation Content 120 to a user (not shown) as played presentation 127. Played presentation 127 represents the visible and/or audible information associated with Presentation Content 120 that is produced by mixer/renderer 110 and receivable by the user via devices such as displays or speakers (not shown). For discussion purposes, it is assumed that Presentation Content 120 and played presentation 127 represent high-definition DVD movie content, in any format. It will be appreciated, however, that Presentation Content 120 and played presentation 127 may be any type of interactive multimedia presentation now known or later developed.
Video component 122 represents the traditional video, audio or data components of Presentation Content 120. For example, a movie generally has one or more versions (a version for mature audiences, and a version for younger audiences, for example); one or more titles 131 with one or more chapters (not shown) associated with each title (titles are discussed further below, in connection with presentation manager 106); one or more audio tracks (for example, the movie may be played in one or more languages, with or without subtitles); and extra features such as director's commentary, additional footage, trailers, and the like. It will be appreciated that distinctions between titles and chapters are purely logical distinctions. For example, a single perceived media segment could be part of a single title/chapter, or could be made up of multiple titles/chapters. It is up to the content authoring source to determine the applicable logical distinctions. It will also be appreciated that although video component 122 is referred to as a movie, video component 122 may in fact be video, audio, data, or any combination thereof.
The video, audio, or data that forms video component 122 originates from one or more media sources 160 (for exemplary purposes, two media sources 160 are shown within AVC manager 102). A media source is any device, location, or data from which video, audio, or data is derived or obtained. Examples of media sources include, but are not limited to, networks, hard drives, optical media, alternate physical disks, and data structures referencing storage locations of specific video, audio, or data.
Groups of samples of video, audio, or data from a particular media source are referred to as clips 123 (shown within video component 122, AVC manager 102, and playlist 128). Referring to AVC manager 102, information associated with clips 123 is received from one or more media sources 160 and decoded at decoder blocks 161. Decoder blocks 161 represent any devices, techniques or steps used to retrieve renderable video, audio, or data content from information received from a media source 160. Decoder blocks 161 may include encoder/decoder pairs, demultiplexers, or decrypters, for example. Although a one-to-one relationship between decoders and media sources is shown, it will be appreciated that one decoder may serve multiple media sources, and vice-versa.
Audio/video content data (“A/V data”) 132 is data associated with video component 122 that has been prepared for rendering by AVC manager 102 and transmitted to mixer/renderer 110. Frames of A/V data 132 generally include, for each active clip 123, a rendering of a portion of the clip. The exact portion or amount of the clip rendered in a particular frame may be based on several factors, such as the characteristics of the video, audio, or data content of the clip, or the formats, techniques, or rates used to encode or decode the clip.
IC component 124 includes media objects 125, which are user-selectable visible or audible objects, optionally presentable concurrently with video component 122, along with any instructions (shown as applications 155 and discussed further below) for presenting the visible or audible objects. Media objects 125 may be static or animated. Examples of media objects include, among other things, video samples or clips, audio samples or clips, graphics, text, and combinations thereof.
Media objects 125 originate from one or more sources (not shown). A source is any device, location, or data from which media objects are derived or obtained. Examples of sources for media objects 125 include, but are not limited to, networks, hard drives, optical media, alternate physical disks, and data structures referencing storage locations of specific media objects. Examples of formats of media objects 125 include, but are not limited to, portable network graphics (“PNG”), joint photographic experts group (“JPEG”), moving picture experts group (“MPEG”), multiple-image network graphics (“MNG”), audio video interleave (“AVI”), extensible markup language (“XML”), hypertext markup language (“HTML”), and extensible HTML (“XHTML”).
Applications 155 provide the mechanism by which Presentation System 100 presents media objects 125 to a user. Applications 155 represent any signal processing method or stored instruction(s) that electronically control predetermined operations on data. It is assumed for discussion purposes that IC component 124 includes three applications 155, which are discussed further below in connection with
Interactive content data (“IC data”) 134 is data associated with IC component 124 that has been prepared for rendering by IC manager 104 and transmitted to mixer/renderer 110. Each application has an associated queue (not shown), which holds one or more work items (not shown) associated with rendering the application.
Presentation manager 106, which is configured for communication with both AVC manager 102 and IC manager 104, facilitates handling of Presentation Content 120 and presentation of played presentation 127 to the user. Presentation manager 106 has access to a playlist 128. Playlist 128 includes, among other things, a time-ordered sequence of clips 123 and applications 155 (including media objects 125) that are presentable to a user. The clips 123 and applications 155/media objects 125 may be arranged to form one or more titles 131. For exemplary purposes, one title 131 is discussed herein. Playlist 128 may be implemented using an XML document or another data structure.
Presentation manager 106 uses playlist 128 to ascertain a presentation timeline 130 for title 131. Conceptually, presentation timeline 130 indicates the times within title 131 when specific clips 123 and applications 155 are presentable to a user. A sample presentation timeline 130, which illustrates exemplary relationships between presentation of clips 123 and applications 155, is shown and discussed in connection with
Presentation manager 106 provides information, including but not limited to information about presentation timeline 130, to AVC manager 102 and IC manager 104. Based on input from presentation manager 106, AVC manager 102 prepares A/V data 132 for rendering, and IC manager 104 prepares IC data 134 for rendering.
Timing signal management block 108 produces various timing signals 158, which are used to control the timing for preparation and production of A/V data 132 and IC data 134 by AVC manager 102 and IC manager 104, respectively. In particular, timing signals 158 are used to achieve frame-level synchronization of A/V data 132 and IC data 134. Details of timing signal management block 108 and timing signals 158 are discussed further below, in connection with
Mixer/renderer 110 renders A/V data 132 in a video plane (not shown), and renders IC data 134 in a graphics plane (not shown). The graphics plane is generally, but not necessarily, overlaid onto the video plane to produce played presentation 127 for the user.
With continuing reference to
The particular amount of time along horizontal axis 220 in which title 131 is presentable to the user is referred to as play duration 292 of title 131. Specific times within play duration 292 are referred to as title times. Four title times (“TTs”) are shown on presentation timeline 130—TT1 293, TT2 294, TT3 295, and TT4 296. Because a title may be played once or may be played more than once (in a looping fashion, for example), play duration 292 is determined based on one iteration of title 131. Play duration 292 may be determined with respect to any desired reference, including but not limited to a predetermined play speed (for example, normal, or 1×, play speed), a predetermined frame rate, or a predetermined timing signal status. Play speeds, frame rates, and timing signals are discussed further below, in connection with
Other times and/or durations within play duration 292 are also defined and discussed herein. Video presentation intervals 240 are defined by beginning and ending times of play duration 292 between which particular content associated with video component 122 is playable. For example, video clip 1 230 has a presentation interval 240 between title times TT2 294 and TT4 296, and video clip 2 250 has a presentation interval 240 between title times TT3 295 and TT4 296. Application presentation intervals, application play durations, page presentation intervals, and page durations are also defined and discussed below, in connection with
With continuing reference to
A second type of time interval is one in which video component 122 is scheduled for presentation. Time interval 2 298 and time interval 3 299 are examples of the second type of time interval. Sometimes, more than one video may be scheduled for presentation during the second type of time interval. Often, but not always, interactive content is presentable during the second type of time interval. For example, in time interval 2 298, menu 280 and graphic overlay 290 are scheduled for presentation concurrently with video clip 1 230. In time interval 3 299, menu 280 is scheduled for concurrent presentation with video clip 1 230 and video clip 2 250.
With continuing reference to
Application play duration 320 is a particular amount of time, with reference to an amount (a part or all) of play duration 292 within which media objects 125 associated with application 155 are presentable to and/or selectable by a recipient of played presentation 127. In the context of
The intervals defined by beginning and ending title times obtained when an application play duration 320 associated with a particular application is conceptualized on presentation timeline 130 are referred to as application presentation intervals 321. For example, the application responsible for copyright notice 206 has an application presentation interval beginning at TT1 293 and ending at TT2 294, the application responsible for menu 280 has an application presentation interval beginning at TT2 294 and ending at TT4 296, and the application responsible for graphic overlay 290 has an application presentation interval beginning at TT2 294 and ending at TT3 295.
Referring again to
The number of applications and pages associated with a given title, and the media objects associated with each application or page, are generally logical distinctions that are matters of design choice. Multiple pages may be used when it is desirable to manage (for example, limit) the number or amount of resources associated with an application that are loaded into memory during execution of the application. Resources for an application include the media objects used by the application, as well as instructions 304 for rendering the media objects. For example, when an application with multiple pages is presentable, it may be possible to load into memory only those resources associated with a currently presentable page of the application.
Resource package data structure 340 is used to facilitate loading of application resources into memory prior to execution of the application. Resource package data structure 340 references memory locations where resources for that application are located. Resource package data structure 340 may be stored in any desirable location, together with or separate from the resources it references. For example, resource package data structure 340 may be disposed on an optical medium such as a high-definition DVD, in an area separate from video component 122. Alternatively, resource package data structure 340 may be embedded into video component 122. In a further alternative, the resource package data structure may be remotely located. One example of a remote location is a networked server. Topics relating to handling the transition of resources for application execution, and between applications, are not discussed in detail herein.
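By way of illustration only, the general shape of such a structure may be sketched in script form as follows; all field names are invented for this sketch, and no particular on-disc format is implied.

    // Hypothetical sketch of resource package data structure 340; field names invented.
    // The structure references locations of resources rather than containing them.
    const resourcePackage = {
      application: "copyright-notice",              // the application the package serves
      resources: [
        { kind: "media-object", location: "..." },  // media objects 125 used by the application
        { kind: "instructions", location: "..." }   // instructions 304 for rendering them
      ]
    };

Because the structure holds only references, it may be read and its resources loaded (from optical media, a network, or elsewhere) before the application itself executes.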
Referring again to application 155 itself, instructions 304, when executed, perform tasks related to rendering of media objects 125 associated with application 155, based on user input. One type of user input (or a result thereof) is a user event. User events are actions or occurrences initiated by a recipient of played presentation 127 that relate to IC component 124. User events are generally, but not necessarily, asynchronous. Examples of user events include, but are not limited to, user interaction with media objects within played presentation 127, such as selection of a button within menu 280, or selection of the circle associated with graphical overlay 290. Such interactions may occur using any type of user input device now known or later developed, including a keyboard, a remote control, a mouse, a stylus, or a voice command. It will be appreciated that application 155 may respond to events other than user events, but such events are not specifically discussed herein.
In one implementation, instructions 304 are computer-executable instructions encoded in computer-readable media (discussed further below, in connection with
Script 308 includes instructions 304 written in a non-declarative programming language, such as an imperative programming language. An imperative programming language describes computation in terms of a sequence of commands to be performed by a processor. In most cases where script 308 is used, the script is used to respond to user events. Script 308 is useful in other contexts, however, such as handling issues that are not readily or efficiently implemented using markup elements alone. Examples of such contexts include system events and resource management (for example, accessing cached or persistently stored resources). In one implementation, script 308 is ECMAScript as defined by ECMA International in the ECMA-262 specification. Common scripting programming languages falling under ECMA-262 include JavaScript and JScript. In some settings, it may be desirable to implement script 308 using a subset of ECMAScript, such as the compact profile defined by ECMA-327, along with a host environment and a set of application programming interfaces.
Markup elements 302, 306, 310, 312, and 360 represent instructions 304 written in a declarative programming language, such as Extensible Markup Language (“XML”). In XML, elements are logical units of information defined, using start-tags and end-tags, within XML documents. XML documents are data objects that are made up of storage units called entities (also called containers), which contain either parsed or unparsed data. Parsed data is made up of characters, some of which form character data, and some of which form markup. Markup encodes a description of the document's storage layout and logical structure. There is one root element in an XML document, no part of which appears in the content of any other element. For all other elements, the start-tags and end-tags are within the content of other elements, nested within each other.
An XML schema is a definition of the syntax(es) of a class of XML documents. One type of XML schema is a general-purpose schema. Some general-purpose schemas are defined by the World Wide Web Consortium (“W3C”). Another type of XML schema is a special-purpose schema. In the high-definition DVD context, for example, one or more special-purpose XML schemas have been promulgated by the DVD Forum for use with XML documents in compliance with the DVD Specifications for High Definition Video. It will be appreciated that other schemas for high-definition DVD movies, as well as schemas for other interactive multimedia presentations, are possible.
At a high level, an XML schema includes: (1) a global element declaration, which associates an element name with an element type, and (2) a type definition, which defines attributes, sub-elements, and character data for elements of that type. Attributes of an element specify particular properties of the element using a name/value pair, with one attribute specifying a single element property.
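By way of a sketch only, a schema fragment pairing a global element declaration with a type definition might read as follows; the element and type names are hypothetical and are not taken from any promulgated schema.

    <!-- Illustrative only: hypothetical names, not from any promulgated schema. -->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <!-- (1) Global element declaration: associates the element name "cue" with a type. -->
      <xs:element name="cue" type="cueType"/>
      <!-- (2) Type definition: attributes, sub-elements, and character data for that type. -->
      <xs:complexType name="cueType" mixed="true">
        <xs:sequence>
          <xs:element name="label" type="xs:string" minOccurs="0"/>
        </xs:sequence>
        <!-- Each attribute specifies a single element property as a name/value pair. -->
        <xs:attribute name="begin" type="xs:string"/>
        <xs:attribute name="clock" type="xs:string"/>
      </xs:complexType>
    </xs:schema>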
Content elements 302, which may include user event elements 360, are used to identify particular media object elements 312 presentable to a user by application 155. Media object elements 312, in turn, generally specify locations where data defining particular media objects 125 is disposed. Such locations may be, for example, locations in persistent local or remote storage, including locations on optical media, or on wired or wireless, public or private networks, such as on the Internet, privately-managed networks, or the World Wide Web. Locations specified by media object elements 312 may also be references to locations, such as references to resource package data structure 340. In this manner, locations of media objects 125 may be specified indirectly.
Timing elements 306 are used to specify the times at, or the time intervals during, which particular content elements 302 are presentable to a user by a particular application 155. Examples of timing elements include par, timing, or seq elements within a time container of an XML document.
Style elements 310 are generally used to specify the appearance of particular content elements 302 presentable to a user by a particular application.
User event elements 360 represent content elements 302, timing elements 306 or style elements 310 that are used to define or respond to user events.
Markup elements 302, 306, 310, and 360 have attributes that are usable to specify certain properties of their associated media object elements 312/media objects 125. In one implementation, these attributes/properties represent values of one or more clocks or timing signals (discussed further below, in connection with
A sample XML document containing markup elements is set forth below (script 308 is not shown). The sample XML document includes style 310 and timing 306 elements for performing a crop animation on a content element 302, which references a media object element 312 called “id.” The location of data defining media object 125 associated with the “id” media object element is not shown.
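(The listing below is a reconstruction for illustration, consistent with the description in the following paragraphs; the namespace URIs and the style and animation attribute names are placeholders rather than the original listing.)

    <!-- Reconstruction for illustration only; namespace URIs and the style and
         animation attribute names are placeholders. -->
    <xml xmlns="urn:example:markup"
         xmlns:style="urn:example:markup-style"
         xmlns:anim="urn:example:markup-animation">
      <head>
        <styling>
          <!-- Style elements 310 associated with content element "id". -->
          <style id="cropRegion" style:position="absolute" style:x="0px" style:y="0px"/>
        </styling>
        <timing>
          <!-- Timing elements 306: a seq time container scheduling the crop animation. -->
          <seq>
            <cue select="id" begin="0s" end="5s">
              <anim:crop from="0 0 1920 1080" to="480 270 1440 810"/>
            </cue>
          </seq>
        </timing>
      </head>
      <body>
        <!-- Content element 302 "id"; the location of the media object data is not shown. -->
        <div id="id">
          <img src="..."/>
        </div>
      </body>
    </xml>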
The sample XML document begins with a root element called “xml.” Following the root element, several namespace “xmlns” fields refer to locations on the World Wide Web where various schemas defining the syntax for the sample XML document, and containers therein, can be found. In the context of an XML document for use with a high-definition DVD movie, for example, the namespace fields may refer to websites associated with the DVD Forum.
One content element 302 referred to as “id” is defined within a container described by tags labeled “body.” Style elements 310 (elements under the label “styling” in the example) associated with content element “id” are defined within a container described by tags labeled “head.” Timing elements 306 (elements under the label “timing”) are also defined within the container described by tags labeled “head.”
With continuing reference to
Timing signal management block 108 is responsible for the handling of clocks and/or timing signals that are used to determine specific times or time durations within Presentation System 100. As shown, a continuous timing signal 401 is produced at a predetermined rate by a clock source 402. Clock source 402 may be a clock associated with a processing system, such as a general-purpose computer or a special-purpose electronic device. Timing signal 401 produced by clock source 402 generally changes continually as a real-world clock would—within one second of real time, clock source 402 produces, at a predetermined rate, one second's worth of timing signal 401. Timing signal 401 is input to IC frame rate calculator 404, A/V frame rate calculator 406, time reference calculator 408, and time reference calculator 490.
IC frame rate calculator 404 produces a timing signal 405 based on timing signal 401. Timing signal 405 is referred to as an “IC frame rate,” which represents the rate at which frames of IC data 134 are produced by IC manager 104. One exemplary value of the IC frame rate is 30 frames per second. IC frame rate calculator 404 may reduce or increase the rate of timing signal 401 to produce timing signal 405.
Frames of IC data 134 generally include, for each valid application 155 and/or page thereof, a rendering of each media object 125 associated with the valid application and/or page in accordance with relevant user events. For exemplary purposes, a valid application is one that has an application presentation interval 321 within which the current title time of play duration 292 falls, based on presentation timeline 130. It will be appreciated that an application may have more than one application presentation interval. It will also be appreciated that no specific distinctions are made herein about an application's state based on user input or resource availability.
A/V frame rate calculator 406 also produces a timing signal—timing signal 407—based on timing signal 401. Timing signal 407 is referred to as an “A/V frame rate,” which represents the rate at which frames of A/V data 132 are produced by AVC manager 102. The A/V frame rate may be the same as, or different from, IC frame rate 405. One exemplary value of the A/V frame rate is 24 frames per second. A/V frame rate calculator 406 may reduce or increase the rate of timing signal 401 to produce timing signal 407.
A clock source 470 produces timing signal 471, which governs the rate at which information associated with clips 123 is produced from media source(s) 160. Clock source 470 may be the same clock as clock source 402, or based on the same clock as clock source 402. Alternatively, clocks 470 and 402 may be altogether different, and/or have different sources. Clock source 470 adjusts the rate of timing signal 471 based on a play speed input 480. Play speed input 480 represents user input received that affects the play speed of played presentation 127. Play speed is affected, for example, when a user jumps from one part of the movie to another (referred to as “trick play”), or when the user pauses, slow-forwards, fast-forwards, slow-reverses, or fast-reverses the movie. Trick play may be achieved by making selections from menu 280 (shown in
Time references 452 represent the amounts of time that have elapsed within particular presentation intervals 240 associated with active clips 123. For purposes of discussion herein, an active clip is one that has a presentation interval 240 within which the current title time of play duration 292 falls, based on presentation timeline 130. Time references 452 are referred to as “elapsed clip play time(s).” Time reference calculator 454 receives time references 452 and produces a media time reference 455. Media time reference 455 represents the total amount of play duration 292 that has elapsed based on one or more time references 452. In general, when two or more clips are playing concurrently, only one time reference 452 is used to produce media time reference 455. The particular clip used to determine media time reference 455, and how media time reference 455 is determined based on multiple clips, is a matter of implementation preference.
Time reference calculator 408 receives timing signal 401, media time reference 455, and play speed input 480, and produces a title time reference 409. Title time reference 409 represents the total amount of time that has elapsed within play duration 292 based on one or more of the inputs to time reference calculator 408. An exemplary method for calculating title time is shown and described in connection with
Time reference calculator 490 receives timing signal 401 and title time reference 409, and produces application time reference(s) 492 and page time reference(s) 494. A single application time reference 492 represents an amount of elapsed time of a particular application play duration 320 (shown and discussed in connection with
Page time reference 494 represents an amount of elapsed time of a single page play duration 332, 337 (also shown and discussed in connection with
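Taken together, the relationships among these clocks and time references can be sketched in script form of the kind script 308 might use; all names below are invented for illustration and do not come from any actual schema or player API.

    // Illustrative sketch only; names invented. Timing signal 401 advances at a
    // continuous predetermined rate; timing signal 471 advances at a rate scaled
    // by play speed input 480.
    function onContinuousTick(state, delta) {          // delta: seconds of timing signal 401
      if (state.applicationValid) {
        state.applicationTime += delta;                // application time reference 492
        if (state.currentPage !== null) {
          state.pageTime += delta;                     // page time reference 494
        }
      }
    }

    function onPlaySpeedTick(state, delta, playSpeed) { // clock source 470
      const elapsed = delta * playSpeed;               // 0 when paused, 2 at 2x fast-forward
      for (const clip of state.activeClips) {
        clip.elapsedPlayTime += elapsed;               // time references 452 (elapsed clip play times)
      }
      if (state.activeClips.length > 0) {
        // Media time reference 455: derived from one of the active clips' time references.
        state.mediaTime = state.activeClips[0].elapsedPlayTime;
      }
      state.titleTime += elapsed;                      // title time reference 409, during video intervals
    }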
Table 1 illustrates exemplary occurrences during play of played presentation 127 by Presentation System 100, and the effects of such occurrences on application time reference 492, page time reference 494, title time reference 409, and media time reference 455.
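Table 1 may be summarized as follows (values are taken from the narration below; rows whose values are not fully stated in the narration, such as the fast-forward and new-title rows, are omitted):

Timing Signal 401 | Occurrence | Application Time 492 | Page Time(s) 494 | Title Time 409 | Media Time 455
---|---|---|---|---|---
0 | Movie begins playing | inactive | inactive | 0 | 0
10 | Application becomes valid and activates | 0 | page one: 0 | 10 | 10
15 | Page two loads | 5 | page one: 5 | 15 | 15
20 | Page three loads | 10 | page two: 5; page one inactive | 20 | 20
22 | Movie pauses | 12 | page three: 2; pages one and two inactive | 22 | 22
24 | Movie resumes | 14 | page three: 4 | 22 | 22
27 | New clip starts | 17 | page three: 7 | 25 | 0
32 | User de-activates application | 22 | 12 | 30 | 5
39 | User jumps backward within the clip; application re-activates | 0 | page one: 0 | 27 | 2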
The movie begins playing when the timing signal has a value of zero. When the timing signal has a value of 10, the application becomes valid and activates. Application time 492, as well as page time 494 associated with page one of the application, assumes a value of zero. Pages two and three are inactive. Title time 409 and media time 455 both have values of 10.
Page two of the application loads at timing signal value 15. The application time and page one time have values of 5, while the title time and the media time have values of 15.
Page three of the application loads when the timing signal has a value of 20. The application time has a value of 10, page two time has a value of 5, and page one time is inactive. The title time and the media time have values of 20.
The movie pauses at timing signal value 22. The application time has a value of 12, page three time has a value of 2, and pages one and two are inactive. The title time and media time have values of 22. The movie resumes at timing signal value 24. Then, the application time has a value of 14, page three time has a value of 4, and the title time and media time have values of 22.
At timing signal value 27, a new clip starts. The application time has a value of 17, page three time has a value of 7, the title time has a value of 25, and the media time is re-set to zero.
A user de-activates the application at timing signal value 32. The application time has a value of 22, the page time has a value of 12, the title time has a value of 30, and the media time has a value of 5.
At timing signal value 39, the user jumps, backwards, to another portion of the same clip. The application is assumed to be valid at the jumped-to location, and re-activates shortly thereafter. The application time has a value of 0, page one time has a value of zero, the other pages are inactive, the title time has a value of 27, and the media time has a value of 2.
At timing signal value 46, the user changes the play speed of the movie, fast-forwarding at two times the normal speed. Fast-forwarding continues until timing signal value 53. As shown, the application and page times continue to change at a constant pace with the continuous timing signal, unaffected by the change in play speed of the movie, while the title and media times change in proportion to the play speed of the movie. It should be noted that the time at which a particular page of the application is loaded is tied to title time 409 and/or media time 455 (see discussion of application presentation interval(s) 321 and page presentation interval(s) 343, in connection with
At timing signal value 48, a new title begins, and title time 409 and media time 455 are re-set to values of zero. With respect to the initial title, this occurs when the title time has a value of 62, and the media time has a value of 36. Re-setting (not shown) of application time 492 and page time 494 follows re-setting of title time 409 and media time 455.
Having access to various timelines, clock sources, timing signals, and timing signal references enhances the ability of Presentation System 100 to achieve frame-level synchronization of IC data 134 and A/V data 132 within played presentation 127, and to maintain such frame-level synchronization during periods of user interactivity.
With continuing reference to
The method begins at block 600, and continues at block 602, where a non-video time interval within the play duration of the presentation is identified. A non-video time interval is one in which video component 122 is not scheduled for presentation. Although video component 122 may not be scheduled for presentation, it will be appreciated that other video (video data associated with applications 155, for example), may be scheduled for presentation.
One way in which the non-video time interval may be identified is with reference to play duration 292 on presentation timeline 130, which is ascertained from a playlist for the presentation, such as playlist 128. Referring for exemplary purposes to
Referring again to
A video time interval within the play duration of the presentation is identified at block 606. A video time interval is one in which video component 122 is scheduled for presentation. It will be appreciated that video component 122 may include video, audio, data, or any combination thereof, and does not only represent visual information. In the exemplary presentation timeline 130 shown in
Referring again to
When played presentation 127 proceeds during a video time interval, as indicated at diamond 610 and subsequent box 612, the total elapsed play time, title time 409, is determined using the second elapsed play time. Accordingly, during the video time interval, accurate advancement of played presentation 127 is achieved by calculating title time 409 based on the play speed-based timing signal, such as timing signal 471 and/or media time reference 455.
If, as indicated at diamond 614 and subsequent box 616, played presentation 127 proceeds during a non-video time interval, the total elapsed play time, title time 409, is determined using the first elapsed play time. Accordingly, during the non-video time interval, accurate advancement of played presentation 127 is achieved by calculating title time 409 based on a continuous timing signal, such as timing signal 401.
It is desirable to recognize the transition from one type of time interval to another at least one unit of title time 409 in advance of the transition, to facilitate accurate calculation of title time 409 based on either the play speed-based timing signal (timing signal 471 and/or media time reference 455) or the continuous timing signal (timing signal 401). For example, prior to transition from a non-video interval to a video interval, the first frame of A/V data 132 (e.g., the first frame of the main video clip) to be presented in the video interval could be prepared for rendering. Then, the first frame of A/V data 132 would be presentable at the title time when it is scheduled for presentation based on presentation timeline 130. Likewise, prior to transition from a video interval to a non-video interval, the first frame of IC data 134 could be pre-rendered.
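The interval-dependent calculation just described can be sketched in script form (names invented for illustration):

    // Illustrative sketch; names invented. Title time 409 is determined from the
    // play speed-based measurement during video time intervals and from the
    // continuous measurement during non-video time intervals.
    function currentTitleTime(intervalType, firstElapsedPlayTime, secondElapsedPlayTime) {
      // firstElapsedPlayTime: measured with the continuous timing signal (401)
      // secondElapsedPlayTime: measured with the play speed-based signal (471/455)
      return intervalType === "video" ? secondElapsedPlayTime : firstElapsedPlayTime;
    }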
With continuing reference to
In the context of Presentation System 100, Presentation Content 120/played presentation 127 has play duration 292. IC component 124 includes application 155 having instructions 304 for rendering one or more media objects 125. Application 155 has application play duration 320 that in the context of play duration 292 is represented by application presentation interval 321. Video component 122 includes one or more clips 123.
The method begins at block 700, and continues at block 702, where a first timing signal is produced based on the play speed of the presentation. In the context of Presentation System 100, timing signal 471 is produced by clock source 470, which adjusts the rate of timing signal 471 based on play speed input 480.
At block 704, a second timing signal is produced at a continuous predetermined rate. In the context of Presentation System 100, timing signal 401 is produced by clock source 402.
A title time reference is formed at block 706. In the context of Presentation System 100, time reference calculator 408 forms title time reference 409 by measuring an elapsed play time of play duration 292 based on timing signal 401. Title time reference 409 may be based indirectly on timing signal 471 produced by clock source 470, as discussed in connection with
At diamond 708, it is determined whether the title time is within the application presentation interval. When the title time is not within the application presentation interval, the application is deemed inactive, at block 715. If the title time is within the application presentation interval, then the application is valid as discussed above. In the context of Presentation System 100, when title time reference 409 falls within an applicable application presentation interval 321, the associated application 155 is deemed valid.
At diamond 710, it is further determined whether application resources (for example, resources referenced by resource package data structure 340) are loaded. If necessary, resource loading is performed at block 712. In the context of Presentation System 100, prior to playing a particular application 155, such as when the application initially becomes valid, or when the application becomes valid based on a change in the play speed of the presentation (after trick play, for example), resources for application 155 are loaded into a memory, such as a file cache. Resources include media objects 125 associated with the application, as well as instructions 304 for rendering the media objects. Media objects 125 and instructions 304 for a particular application are collectively referred to as a resource package. As discussed in connection with
Referring again to the flowchart of
At diamond 716, it is determined whether the current elapsed play time is within an applicable page presentation interval, and if so, a page time reference is formed at block 718. The page time reference is formed by measuring an elapsed play time of the applicable page play duration 332, 337 based on the second timing signal (timing signal 401). If the current elapsed play time is not within an applicable page presentation interval, the applicable page is deemed inactive, at block 717. In the context of Presentation System 100, when title time reference 409 falls within an applicable page presentation interval 343, page time reference 494 is formed.
Application and page time references may re-set when application presentation intervals end, or in other circumstances, such as in response to user events or play speed inputs 480. For example, after trick play, assuming title time 409 is within application presentation interval 321, application time references (and page time references, as applicable) may re-start (at zero or another starting value).
At block 720, an instruction is associated with a media object. In the context of Presentation System 100, one type of instruction is instruction 304 associated with application 155. Instruction 304 represents one or more declarative language data structures, such as XML markup elements 302, 306, 310, 312, 360 or attributes thereof, used alone or in combination with script 308, to reference states of one or more clocks or timing signals for the purpose of establishing times at, or time durations within, which media object(s) 125 are rendered. Markup elements within content containers, timing containers, or style containers may refer to, or have one or more attributes that refer to, timing signal 401 or timing signal 471.
Elements and attributes thereof can refer to timing signal 401 and/or timing signal 471 directly or indirectly. For example, timing signal 401 may be referred to indirectly via clock source 402, IC frame rate calculator 404, A/V frame rate calculator 406, application time 492, or page time 494. Likewise, timing signal 471 may be referred to indirectly via clock source 470, elapsed clip play time(s) 452, time reference calculator 454, media time reference 455, time reference calculator 408, or title time reference 409, for example.
In one example, one or more attributes may be defined in special-purpose XML schemas, such as XML schemas for use with certain high-definition DVD movies. One example of such an attribute is referred to herein as a “clock attribute,” which is defined by one or more XML schemas promulgated by the DVD Forum for use with XML documents in compliance with the DVD Specifications for High Definition Video. The clock attribute is usable with various elements in content, timing, or style containers to refer directly or indirectly to timing signal 401 or timing signal 471. In another example, the par, timing, or seq elements within a time container may refer to, or may have one or more attributes that refer to, timing signal 401 or timing signal 471. In this manner, markup elements within timing containers of XML documents may be used to define media object presentation intervals 345 with reference to both page time and title time. In yet another example, timer elements may also be defined that can be used by an application to be notified when some specific duration has elapsed. In a further example, user events and other types of events may be defined by times linked to different time scales. Times, or time intervals, for which a particular event is valid may be established by referring to timing signal 401 or timing signal 471.
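By way of a sketch only, markup of the general kind described might read as follows; the element names, attribute values, and the “title”/“page” clock values shown are hypothetical and are not quoted from any DVD Forum schema.

    <!-- Hypothetical markup; element names and clock attribute values are invented. -->
    <timing>
      <par>
        <!-- This cue's interval is keyed to title time 409, which follows the play speed. -->
        <cue select="chapterMenu" clock="title" begin="00:00:10" end="00:00:15"/>
        <!-- This cue's interval is keyed to page time 494, which advances with the
             continuous timing signal 401 regardless of play speed. -->
        <cue select="overlay" clock="page" begin="0s" end="5s"/>
      </par>
    </timing>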
Expressions involving logical references to clocks, timing signals, time reference calculators, and/or time references may also be used to define conditions for presenting media objects 125 using elements or attributes of elements in XML documents. For example, Boolean operands such as “AND,” “OR,” and “NOT”, along with other operands or types thereof, may be used to define such expressions or conditions.
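As an invented illustration of such an expression (the “enabled” attribute and the operand names are hypothetical):

    <!-- Hypothetical expression syntax; attribute name and operands are invented. -->
    <cue select="id" enabled="(titleTime >= 10) AND (NOT menuVisible)" begin="0s" end="5s"/>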
As indicated at diamond 722 and block 724, the media object is rendered when, based on the instruction, the time for rendering the media object is reached. It will be appreciated that a media object is not always rendered, because user input may dictate whether and when the media object is rendered.
In the context of Presentation System 100, during execution of a particular application 155, a document object model (“DOM”) tree (not shown) associated with the application maintains the context for the state of the markup elements, and a script host (not shown) associated with the application maintains the context for the script's variables, functions, and other states. As execution of application instructions 304 progresses and user input is received, the properties of any affected elements are recorded and may be used to trigger behavior of media objects 125 within played presentation 127. It can be seen that synchronization between interactive and video components of Presentation Content 120/played presentation 127 is achieved based on one or more clocks outside of the DOM, rather than clocks associated with the DOM.
Work items (not shown) resulting from execution of instructions 304 are placed in queue(s) (not shown), and are performed at a rate provided by IC frame rate 405. IC data 134 resulting from performance of work items is transmitted to mixer/renderer 110. Mixer/renderer 110 renders IC data 134 in the graphics plane to produce the interactive portion of played presentation 127 for the user.
The processes illustrated in
A processor 802 is responsive to computer-readable media 804 and to computer programs 806. Processor 802, which may be a real or a virtual processor, controls functions of an electronic device by executing computer-executable instructions. Processor 802 may execute instructions at the assembly, compiled, or machine-level to perform a particular process. Such instructions may be created using source code or any other known computer program design tool.
Computer-readable media 804 represent any number and combination of local or remote devices, in any form, now known or later developed, capable of recording, storing, or transmitting computer-readable data, such as the instructions executable by processor 802. In particular, computer-readable media 804 may be, or may include, a semiconductor memory (such as a read only memory (“ROM”), any type of programmable ROM (“PROM”), a random access memory (“RAM”), or a flash memory, for example); a magnetic storage device (such as a floppy disk drive, a hard disk drive, a magnetic drum, a magnetic tape, or a magneto-optical disk); an optical storage device (such as any type of compact disk or digital versatile disk); a bubble memory; a cache memory; a core memory; a holographic memory; a memory stick; a paper tape; a punch card; or any combination thereof. Computer-readable media 804 may also include transmission media and data associated therewith. Examples of transmission media/data include, but are not limited to, data embodied in any form of wireline or wireless transmission, such as packetized or non-packetized data carried by a modulated carrier signal.
Computer programs 806 represent any signal processing methods or stored instructions that electronically control predetermined operations on data. In general, computer programs 806 are computer-executable instructions implemented as software components according to well-known practices for component-based software development, and encoded in computer-readable media (such as computer-readable media 804). Computer programs may be combined or distributed in various ways.
Functions/components described in the context of Presentation System 100 are not limited to implementation by any specific embodiments of computer programs. Rather, functions are processes that convey or transform data, and may generally be implemented by, or executed in, hardware, software, firmware, or any combination thereof, located at, or accessed by, any combination of functional elements of Presentation System 100.
With continued reference to
As shown, operating environment 900 includes or accesses components of computing unit 800, including processor 802, computer-readable media 804, and computer programs 806. Storage 904 includes additional or different computer-readable media associated specifically with operating environment 900, such as an optical disc, which is handled by optical disc drive 906. One or more internal buses 920, which are well-known and widely available elements, may be used to carry data, addresses, control signals and other information within, to, or from computing environment 900 or elements thereof.
Input interface(s) 908 provide input to computing environment 900. Input may be collected using any type of now known or later-developed interface, such as a user interface. User interfaces may be touch-input devices such as remote controls, displays, mice, pens, styluses, trackballs, keyboards, microphones, scanning devices, and all types of devices that are used to input data.
Output interface(s) 910 provide output from computing environment 900. Examples of output interface(s) 910 include displays, printers, speakers, drives (such as optical disc drive 906 and other disc drives), and the like.
External communication interface(s) 912 are available to enhance the ability of computing environment 900 to receive information from, or to transmit information to, another entity via a communication medium such as a channel signal, a data signal, or a computer-readable medium. External communication interface(s) 912 may be, or may include, elements such as cable modems, data terminal equipment, media players, data storage devices, personal digital assistants, or any other device or component/combination thereof, along with associated network support devices and/or software or interfaces.
On client-side 1002, one or more clients 1006, which may be implemented in hardware, software, firmware, or any combination thereof, are responsive to client data stores 1008. Client data stores 1008 may be computer-readable media 804, employed to store information local to clients 1006. On server-side 1004, one or more servers 1010 are responsive to server data stores 1012. Like client data stores 1008, server data stores 1012 may be computer-readable media 804, employed to store information local to servers 1010.
Various aspects of an interactive multimedia presentation system that is used to present interactive content to a user synchronously with audio/video content have been described. An interactive multimedia presentation has been generally described as having a play duration, a variable play speed, a video component, and an IC component. It will be understood, however, that all of the foregoing components need not be used, nor must the components, when used, be present concurrently. Functions/components described in the context of Presentation System 100 as being computer programs are not limited to implementation by any specific embodiments of computer programs. Rather, functions are processes that convey or transform data, and may generally be implemented by, or executed in, hardware, software, firmware, or any combination thereof.
Although the subject matter herein has been described in language specific to structural features and/or methodological acts, it is also to be understood that the subject matter defined in the claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will further be understood that when one element is indicated as being responsive to another element, the elements may be directly or indirectly coupled. Connections depicted herein may be logical or physical in practice to achieve a coupling or communicative interface between elements. Connections may be implemented, among other ways, as inter-process communications among software processes, or inter-machine communications among networked computers.
The word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any implementation or aspect thereof described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations or aspects thereof.
As it is understood that embodiments other than the specific embodiments described above may be devised without departing from the spirit and scope of the appended claims, it is intended that the scope of the subject matter herein will be governed by the following claims.
This application claims the benefit of provisional application No. 60/695,944, filed Jul. 1, 2005, which is incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---|
5195092 | Wilson et al. | Mar 1993 | A |
5208745 | Quentin et al. | May 1993 | A |
5394547 | Correnti et al. | Feb 1995 | A |
5452435 | Malouf et al. | Sep 1995 | A |
5515490 | Buchanan et al. | May 1996 | A |
5608859 | Taguchi | Mar 1997 | A |
5631694 | Aggarwal et al. | May 1997 | A |
5659539 | Porter et al. | Aug 1997 | A |
5694560 | Uya et al. | Dec 1997 | A |
5717468 | Baryla | Feb 1998 | A |
5758008 | Tozaki et al. | May 1998 | A |
5760780 | Larson et al. | Jun 1998 | A |
5794018 | Vrvilo et al. | Aug 1998 | A |
5809512 | Kato | Sep 1998 | A |
5877763 | Berry | Mar 1999 | A |
5949410 | Fung | Sep 1999 | A |
5966121 | Hubbell et al. | Oct 1999 | A |
5995095 | Ratakonda | Nov 1999 | A |
6067638 | Benitz et al. | May 2000 | A |
6069633 | Apparao et al. | May 2000 | A |
6100881 | Gibbons et al. | Aug 2000 | A |
6122433 | McLaren | Sep 2000 | A |
6212595 | Mendel | Apr 2001 | B1 |
6369830 | Brunner et al. | Apr 2002 | B1 |
6384846 | Hiroi | May 2002 | B1 |
6385596 | Wiser et al. | May 2002 | B1 |
6414686 | Protheroe et al. | Jul 2002 | B1 |
6426778 | Valdez, Jr. | Jul 2002 | B1 |
6430570 | Judge et al. | Aug 2002 | B1 |
6442658 | Hunt et al. | Aug 2002 | B1 |
6453459 | Brodersen et al. | Sep 2002 | B1 |
6466733 | Kim | Oct 2002 | B1 |
6505153 | Van Thong et al. | Jan 2003 | B1 |
6564382 | Duquesnoois et al. | May 2003 | B2 |
6565153 | Hensel et al. | May 2003 | B2 |
6577341 | Yamada et al. | Jun 2003 | B1 |
6628283 | Gardner | Sep 2003 | B1 |
6629150 | Huded | Sep 2003 | B1 |
6642939 | Vallone et al. | Nov 2003 | B1 |
6665835 | Gutfreund et al. | Dec 2003 | B1 |
6700588 | MacInnis et al. | Mar 2004 | B1 |
6715126 | Chang et al. | Mar 2004 | B1 |
6785729 | Overby et al. | Aug 2004 | B1 |
6892351 | Vasudevan et al. | May 2005 | B2 |
6906643 | Samadani et al. | Jun 2005 | B2 |
6920613 | Dorsey et al. | Jul 2005 | B2 |
6925499 | Chen et al. | Aug 2005 | B1 |
7120859 | Wettach | Oct 2006 | B2 |
7131143 | LaMacchia et al. | Oct 2006 | B1 |
7200357 | Janik et al. | Apr 2007 | B2 |
7210037 | Samar | Apr 2007 | B2 |
7222237 | Wuidart et al. | May 2007 | B2 |
7290263 | Yip et al. | Oct 2007 | B1 |
7437659 | Taniwaki et al. | Oct 2008 | B2 |
7480446 | Bhadkamkar et al. | Jan 2009 | B2 |
7496845 | Deutscher et al. | Feb 2009 | B2 |
7500175 | Colle et al. | Mar 2009 | B2 |
7634739 | McCrossan et al. | Dec 2009 | B2 |
7721308 | Finger et al. | May 2010 | B2 |
7729598 | Ikeda et al. | Jun 2010 | B2 |
7861150 | Colle et al. | Dec 2010 | B2 |
8238722 | Bhadkamkar et al. | Aug 2012 | B2 |
20010005208 | Minami et al. | Jun 2001 | A1 |
20010054180 | Atkinson | Dec 2001 | A1 |
20010056504 | Kuznetsov | Dec 2001 | A1 |
20010056580 | Seo et al. | Dec 2001 | A1 |
20020038257 | Joseph et al. | Mar 2002 | A1 |
20020091837 | Baumeister et al. | Jul 2002 | A1 |
20020099738 | Grant | Jul 2002 | A1 |
20020099952 | Lambert et al. | Jul 2002 | A1 |
20020118220 | Lui et al. | Aug 2002 | A1 |
20020138593 | Novak et al. | Sep 2002 | A1 |
20020157103 | Song et al. | Oct 2002 | A1 |
20020170005 | Hayes | Nov 2002 | A1 |
20020188616 | Chinnici et al. | Dec 2002 | A1 |
20030025599 | Monroe | Feb 2003 | A1 |
20030026398 | Duran et al. | Feb 2003 | A1 |
20030076328 | Beda et al. | Apr 2003 | A1 |
20030078930 | Surcouf et al. | Apr 2003 | A1 |
20030093792 | Labeeb et al. | May 2003 | A1 |
20030142137 | Brown | Jul 2003 | A1 |
20030152904 | Doty, Jr. | Aug 2003 | A1 |
20030174160 | Deutscher et al. | Sep 2003 | A1 |
20030182364 | Large et al. | Sep 2003 | A1 |
20030182624 | Large | Sep 2003 | A1 |
20030187801 | Chase | Oct 2003 | A1 |
20030204511 | Brundage et al. | Oct 2003 | A1 |
20030204613 | Hudson et al. | Oct 2003 | A1 |
20030210270 | Clow | Nov 2003 | A1 |
20030231863 | Eerenberg et al. | Dec 2003 | A1 |
20040001706 | Jung et al. | Jan 2004 | A1 |
20040027259 | Soliman et al. | Feb 2004 | A1 |
20040034622 | Espinoza et al. | Feb 2004 | A1 |
20040034795 | Anderson et al. | Feb 2004 | A1 |
20040039834 | Saunders et al. | Feb 2004 | A1 |
20040039909 | Cheng | Feb 2004 | A1 |
20040049793 | Chou | Mar 2004 | A1 |
20040068510 | Hayes et al. | Apr 2004 | A1 |
20040107179 | Dalrymple, III | Jun 2004 | A1 |
20040107401 | Sung et al. | Jun 2004 | A1 |
20040111270 | Whitham | Jun 2004 | A1 |
20040123316 | Kendall et al. | Jun 2004 | A1 |
20040133292 | Sakurai et al. | Jul 2004 | A1 |
20040143823 | Wei | Jul 2004 | A1 |
20040148514 | Fee et al. | Jul 2004 | A1 |
20040153648 | Rotholtz et al. | Aug 2004 | A1 |
20040153847 | Apte et al. | Aug 2004 | A1 |
20040156613 | Hempel et al. | Aug 2004 | A1 |
20040187157 | Chong et al. | Sep 2004 | A1 |
20040190779 | Sarachik et al. | Sep 2004 | A1 |
20040205478 | Lin et al. | Oct 2004 | A1 |
20040205479 | Seaman et al. | Oct 2004 | A1 |
20040210824 | Shoff et al. | Oct 2004 | A1 |
20040220926 | Lamkin et al. | Nov 2004 | A1 |
20040221311 | Dow et al. | Nov 2004 | A1 |
20040223740 | Itoi | Nov 2004 | A1 |
20040228618 | Yoo et al. | Nov 2004 | A1 |
20040243927 | Chung et al. | Dec 2004 | A1 |
20040244003 | Perfetto et al. | Dec 2004 | A1 |
20040247292 | Chung et al. | Dec 2004 | A1 |
20040250200 | Chung et al. | Dec 2004 | A1 |
20040267952 | He et al. | Dec 2004 | A1 |
20040268224 | Balkus et al. | Dec 2004 | A1 |
20050015815 | Shoff et al. | Jan 2005 | A1 |
20050022116 | Bowman et al. | Jan 2005 | A1 |
20050047754 | Jung et al. | Mar 2005 | A1 |
20050088420 | Dodge et al. | Apr 2005 | A1 |
20050091574 | Maaniitty et al. | Apr 2005 | A1 |
20050114896 | Hug et al. | May 2005 | A1 |
20050125741 | Clow et al. | Jun 2005 | A1 |
20050132266 | Ambrosino et al. | Jun 2005 | A1 |
20050140694 | Subramanian et al. | Jun 2005 | A1 |
20050149729 | Zimmer et al. | Jul 2005 | A1 |
20050183016 | Horiuchi et al. | Aug 2005 | A1 |
20050190947 | Dulac | Sep 2005 | A1 |
20050244146 | Tsumagari et al. | Nov 2005 | A1 |
20050251732 | Lamkin et al. | Nov 2005 | A1 |
20050289348 | Joy et al. | Dec 2005 | A1 |
20060020950 | Ladd et al. | Jan 2006 | A1 |
20060041522 | Rodriguez-Rivera | Feb 2006 | A1 |
20060083486 | Kanemaru et al. | Apr 2006 | A1 |
20060123451 | Preisman | Jun 2006 | A1 |
20060136914 | Marascio et al. | Jun 2006 | A1 |
20060140079 | Hamada et al. | Jun 2006 | A1 |
20060269221 | Hashimoto et al. | Nov 2006 | A1 |
20060274612 | Kim | Dec 2006 | A1 |
20070002045 | Finger et al. | Jan 2007 | A1 |
20070005757 | Finger et al. | Jan 2007 | A1 |
20070005758 | Hughes, Jr. et al. | Jan 2007 | A1 |
20070006061 | Colle et al. | Jan 2007 | A1 |
20070006063 | Jewsbury et al. | Jan 2007 | A1 |
20070006078 | Jewsbury et al. | Jan 2007 | A1 |
20070006079 | Hayes et al. | Jan 2007 | A1 |
20070006080 | Finger et al. | Jan 2007 | A1 |
20070006233 | Finger et al. | Jan 2007 | A1 |
20070006238 | Finger et al. | Jan 2007 | A1 |
20070033419 | Kocher et al. | Feb 2007 | A1 |
20070122122 | Okamoto et al. | May 2007 | A1 |
20070174387 | Jania et al. | Jul 2007 | A1 |
20070198834 | Ksontini et al. | Aug 2007 | A1 |
20070277245 | Goto et al. | Nov 2007 | A1 |
20080126974 | Fawcett et al. | May 2008 | A1 |
20090007160 | Wei | Jan 2009 | A1 |
20110004943 | Chaganti et al. | Jan 2011 | A1 |
Number | Date | Country |
---|---|---|
2340144 | Sep 2001 | CA |
1345119 | Sep 2003 | EP |
1473618 | Nov 2004 | EP |
1551027 | Jul 2005 | EP |
1641259 | Mar 2006 | EP |
2344925 | Jun 2000 | GB |
2000-098999 | Apr 2000 | JP |
2001-022498 | Jan 2001 | JP |
2003-284003 | Oct 2003 | JP |
2004-007610 | Jan 2004 | JP |
2004-086551 | Mar 2004 | JP |
2004-221900 | Aug 2004 | JP |
2005-149394 | Jun 2005 | JP |
20030074093 | Sep 2003 | KR |
0217179 | Feb 2002 | WO |
02091178 | Nov 2002 | WO |
02103496 | Dec 2002 | WO
03062969 | Jul 2003 | WO |
03077249 | Sep 2003 | WO |
2004025651 | Mar 2004 | WO |
2005002219 | Jan 2005 | WO |
2005020236 | Mar 2005 | WO
2005029842 | Mar 2005 | WO
2005048261 | May 2005 | WO |
2005048592 | May 2005 | WO
2005052940 | Jun 2005 | WO |
2005122530 | Dec 2005 | WO
Entry |
---|
Evans, Mark, “Lambda the Ultimate”, Sep. 7, 2003, DP-COOL 2003 Proceedings, lambda-the-ultimate.org/classic/message8639.html. |
International Search Report dated Feb. 26, 2007 for Application No. PCT/US2006/024155, 8 pages. |
Non-Final Office Action for U.S. Appl. No. 11/405,736, dated May 18, 2009, 12 pages. |
Cesar et al., “Open Graphical Framework for Interactive TV”, IEEE Fifth International Symposium on Multimedia Software Engineering (ISMSE'03) p. 21, accessed at http://doi.ieeecomputersociety.org/10.1109/MMSE.2003.1254418 on Sep. 20, 2005. |
Apple, Compressor 2 User Manual, Apr. 25, 2005, Apple, select pages (3 in total). |
Apple, Final Cut Pro 5 User Manual, May 11, 2005, Apple, select pages (73 in total). |
International Search Report for PCT/US06/23905, dated Jun. 20, 2006, 7 pages. |
Non-Final Office Action for U.S. Appl. No. 11/355,209, dated May 15, 2009, 12 pages. |
International Search Report for PCT/US06/23911, dated Jun. 3, 2008, 8 pages. |
International Search Report for PCT/US2006/023907, dated Mar. 2, 2007, 7 pages. |
International Search Report for PCT/US2006/023906, dated Nov. 20, 2006, 6 pages. |
International Search Report for PCT/US06/24034, dated Jun. 20, 2008, 10 pages. |
International Search Report for PCT/US2006/024225, dated Feb. 26, 2007, 7 pages. |
International Search Report for PCT/US2006/024226 dated Jun. 22, 2006, 7 pages. |
Non-Final Office Action for U.S. Appl. No. 11/405,737, dated May 18, 2009, 11 pages. |
Non-Final Office Action for U.S. Appl. No. 11/354,800, dated Sep. 13, 2007, 8 pages. |
Non-Final Office Action for U.S. Appl. No. 11/355,609, dated Jun. 5, 2009, 12 pages. |
Non-Final Office Action for U.S. Appl. No. 11/350,595, dated Jun. 26, 2009, 4 pages. |
Non-Final Office Action for U.S. Appl. No. 11/351,085, dated Jun. 1, 2009, 14 pages. |
Non-Final Office Action for U.S. Appl. No. 11/354,800, dated Jul. 15, 2008, 7 pages. |
Final Office Action for U.S. Appl. No. 11/354,800, dated May 1, 2009, 8 pages. |
Advisory Action for U.S. Appl. No. 11/352,575, dated Jul. 14, 2009, 3 pages. |
Final Office Action for U.S. Appl. No. 11/352,575, dated Apr. 28, 2009, 19 pages. |
Non Final Office Action for U.S. Appl. No. 11/352,575, dated Sep. 2, 2008, 15 pages. |
Non-Final Office Action for U.S. Appl. No. 11/352,662, dated Jun. 2, 2009, 12 pages. |
Anderson et al., “Building multiuser interactive multimedia environments at MERL”, Multimedia, IEEE vol. 2, Issue 4, Winter 1995 pp. 77-82, accessed at http://ieeexplore.ieee.org/search/wrapper.jsp?arnumber=482298 on Sep. 30, 2005. |
Borchers et al., “Musical Design Patterns: An Example of a Human-Centered Model of Interactive Multimedia”, 1997 International Conference on Multimedia Computing and Systems (ICMCS'97) p. 63, accessed at http://doi.ieeecomputersociety.org/10.1109/MMCS.1997.609564 on Sep. 30, 2005. |
Fritzsche, “Multimedia Building Blocks for Distributed Applications”, 1996 International Workshop on Multimedia Software Development (MMSD '96) p. 0041, accessed at http://doi.ieeecomputersociety.org/10.1109/MMSD.1996.557742 on Sep. 30, 2005. |
International Search Report for PCT/US2006/024294 dated Apr. 30, 2007, 7 pages. |
International Search Report for PCT/US2006/024423 dated Apr. 24, 2007, 5 pages. |
International Search Report for PCT/US2006/024292 dated May 7, 2007, 7 pages. |
Peng, C. et al., “Digital Television Application Manager”, Telecommunications Software and Multimedia Laboratory, Helsinki University of Technology, 2001 IEEE International Conference on Multimedia and Expo, 4 pages. |
Non-Final Office Action for U.S. Appl. No. 11/352,571, dated May 18, 2009, 7 pages. |
Final Office Action dated Jan. 22, 2010 in related U.S. Appl. No. 11/350,595, 6 pages. |
Final Office Action dated Dec. 1, 2009 in related U.S. Appl. No. 11/352,571, 9 pages. |
International Search Report for PCT/US06/24034 dated Jun. 26, 2008, 1 page. |
Final Office Action dated Jan. 11, 2010 in related U.S. Appl. No. 11/352,662, 13 pages. |
Non-Final Office Action dated Oct. 8, 2009 in related U.S. Appl. No. 11/354,800, 18 pages. |
Final Office Action dated Nov. 27, 2009 in related U.S. Appl. No. 11/405,737, 12 pages. |
Final Office Action dated Dec. 10, 2009 in related U.S. Appl. No. 11/405,736, 14 pages. |
Final Office Action dated Jan. 25, 2010 in related U.S. Appl. No. 11/351,085, 14 pages. |
Poellauer, Schwan, and West, Proceedings of the Ninth ACM International Conference on Multimedia (International Multimedia Conference, vol. 9), pp. 231-240, 2001. |
Non-Final Office Action dated Dec. 2, 2009 in related U.S. Appl. No. 11/352,575, 23 pages. |
“Z-Order Correction Algorithm for Dialog Boxes”, IBM Technical Disclosure Bulletin, IBM Corp., New York, US, vol. 37, No. 8, Aug. 1, 1994. |
International Search Report for PCT/US06/24292 dated May 7, 2007, 1 page. |
Author Unknown, MHP Project Office: “Digital Video Broadcasting (DVB); Multimedia Home Platform (MHP) Specification 1.1.2” Apr. 25, 2005. Part 1, 405 pages. |
Author Unknown, MHP Project Office: “Digital Video Broadcasting (DVB); Multimedia Home Platform (MHP) Specification 1.1.2” Apr. 25, 2005. Part 2, 405 pages. |
Author Unknown, MHP Project Office: “Digital Video Broadcasting (DVB); Multimedia Home Platform (MHP) Specification 1.1.2” Apr. 25, 2005. Part 3, 362 pages. |
Final Office Action dated Dec. 11, 2009 in related U.S. Appl. No. 11/355,209, 16 pages. |
Non-Final Office Action for U.S. Appl. No. 11/405,737, dated Apr. 30, 2009, 11 pages. |
Willmore, Ben, “Adobe Photoshop CS Studio Techniques”, published Feb. 11, 2004, Safari Online Books, Ch. 3, section titled “Layers Primer”, pp. 104-110, Adobe Press. |
Dekeyser, S. et al., “Path locks for XML document collaboration”, Web Information Systems Engineering, 2002 (WISE 2002), Proceedings of the Third International Conference, Piscataway, NJ, USA, IEEE, Dec. 2002, pp. 105-114. |
Pihkala, K. et al., “Design of a dynamic smil player”, Multimedia and Expo, 2002 (ICME ’02), Proceedings of the 2002 IEEE International Conference, Lausanne, Switzerland, Piscataway, NJ, USA, IEEE, vol. 2, Aug. 2002, pp. 189-192. |
Non-Final Office Action for U.S. Appl. No. 11/405,736, dated May 1, 2009, 10 pages. |
International Search Report for PCT/US06/23907, dated Apr. 26, 2007, 1 page. |
Non-Final Office Action for U.S. Appl. No. 11/352,571, dated Apr. 30, 2009, 7 pages. |
International Search Report for PCT/US06/23912, dated Jun. 19, 2008, 13 pages. |
Non-Final Office Action for U.S. Appl. No. 11/405,816, dated Jun. 3, 2008, 12 pages. |
Benedikt, M. et al., “Managing XML Data: An Abridged Overview”, Computing in Science and Engineering, IEEE Service Center, Los Alamitos, CA, US, vol. 6, No. 4, Jul. 2004, pp. 12-19. |
Barton C. et al., “Streaming XPath processing with forward and backward axes”, Proceedings 19th International Conference on Data Engineering, Bangalore, India, Conf. 19, 2003, p. 455-466. |
Blu-ray Disc Association: “Application Definition Blu-ray Disc Format BD-J Baseline Application and Logical Model Definition for BD-ROM”, Internet Citation, Mar. 1, 2005, retrieved Jun. 18, 2008, http://www.blurayjukebox.com/pdfs/bdj_gem_application_definition_050307-13404, pp. 1-45. |
Blu-Ray Disc: “White paper Blu-ray Disc Format. 2.B Audio Visual Application Format Specifications for BD-ROM”, Internet Citation, Mar. 2005, retrieved Nov. 16, 2007, http://www.blu-raydisc.com/assets/downloadablefile/2b_bdrom_audiovisualapplication_0305-12955-13403.pdf, pp. 1-26. |
Juliana Freire, et al., “Managing XML Data: An Abridged Overview”, Computing in Science & Engineering, 2004 IEEE, pp. 12-19 (8 total pages). |
Notice of Preliminary Rejection with English Language translation issued Jul. 11, 2013, in connection with Korean Patent Application No. 10-2007-7030958 (7 pages total). |
“System performance measure in solaris”, Aug. 12, 2004 (retrieved from http://blog.naver.com/avicom/120004914987) with English Language translation (10 pages total). |
Slingerland et al., “Cache Performance for Multimedia Applications”, ICS '01: Proceedings of the 15th International Conference on Supercomputing, Jun. 2001, pp. 1-14. |
U.S. Final Office Action dated Jan. 16, 2014 in connection with corresponding U.S. Appl. No. 11/405,737 (23 pages total). |
Foxboro® An Invensys Company; I/A Series® Software Logic (LOGIC) Block; 1994, PSS 21S-3L5 B4; pp. 1 and 2 (2 pages total). |
Number | Date | Country
---|---|---|
20070006063 A1 | Jan 2007 | US |
Number | Date | Country
---|---|---|
60695944 | Jul 2005 | US |