Embodiments described herein include a service that enables devices to utilize various web services independent of the communication protocol and format utilized by the devices.
In particular, devices that are capable of media playback and/or web browsing typically use communication protocols and data formats such as XML, EBIF (set-top boxes), REST, or JSON. For such devices, content publishers make available various forms of content, often to supplement or augment the user's playback and/or web browsing experience. For example, cinematic studios often publish movie previews and bonus clips on websites, which can be downloaded and viewed by a device that uses an XML communication format (e.g., an ANDROID device), EBIF (a set-top box), or JSON. Under conventional approaches, content publishers are required to individually address the various communication protocols and formats used by the various devices. In contrast, embodiments described herein include a service that is capable of interfacing a device using any of multiple possible communication protocols with a web service that may or may not handle the particular communication protocol in question.
Moreover, under the embodiments described herein, web service providers may update the functionality and content they provide without having to address the various communication protocols that are in use. Accordingly, otherwise incompatible devices that normally would not communicate with each other may be able to connect, communicate and/or share information.
As used herein, BD refers to Blu-ray Disc®, and BD specification refers to the various documents that define the behavior and requirements of BD players, software and related systems, and, in particular, “System Description Blu-ray Disc Read-Only Format: Part 3. Audio Visual Basic Specifications”. The BD specification includes a section pertaining to virtual file system (“VFS”) updates.
In some embodiments, reference is made to playback devices that operate as Blu-ray players. A Blu-ray player generally supports a virtual file system (VFS) that utilizes files located on the optical disc itself, as well as files that reside in a local storage area of the playback device. The Blu-ray environment generally permits updates to the environment by way of new files and content. With reference to embodiments described herein, a Blu-ray player is able to retrieve information and data from a network, under programmatic instructions that originate from logic or software operating on the player, as well as instructions executing from the disc. In such implementations, the Blu-ray player may be configured to access and retrieve metadata for presentation in connection with media content that is provided at least in part from the optical disc. In this way, the original contents and functionality of the optical disc can be augmented through the addition of content, such as audio and video files, application files, or data files downloaded from a network server. Information that updates the playback environment of the player is received through the VFS of the playback device. With further reference to the Blu-ray environment, the service may perform functions such as dynamically generating a manifest for the data communicated to the player, as well as a digital signature that is in compliance with the BD specification (e.g., "System Description Blu-ray Disc Read-Only Format: Part 3. Audio Visual Basic Specifications"). In the Blu-ray platform, the manifest defines modifications that are to be made to the file system in the playback device. More specifically, the manifest file maps files located on the Blu-ray disc and in local storage to their relative positions and names in the virtual file system.
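By way of illustration, the manifest generation function described above may be sketched as follows. This is a minimal, hypothetical example: the element and attribute names (`manifest`, `file`, `src`, `dst`) are illustrative only and do not reproduce the actual binding-unit manifest schema or signature format defined in the BD specification.

```python
import xml.etree.ElementTree as ET

def build_manifest(file_map):
    """Build a simple manifest mapping files in local storage to their
    positions and names in the virtual file system. Element and
    attribute names are illustrative, not the real BD schema."""
    root = ET.Element("manifest")
    for local_path, vfs_path in sorted(file_map.items()):
        entry = ET.SubElement(root, "file")
        entry.set("src", local_path)   # location in local storage
        entry.set("dst", vfs_path)     # position/name in the VFS
    return ET.tostring(root, encoding="unicode")

# Hypothetical downloaded bonus clip bound into the VFS stream directory.
manifest_xml = build_manifest({
    "BUDA/bonus_clip.m2ts": "BDMV/STREAM/00010.m2ts",
})
```

A real implementation would also produce the digital signature over the manifest, per the BD specification, before the player would accept the update.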
One or more embodiments described herein provide that methods, techniques and actions performed by a computing device are performed programmatically, or as a computer-implemented method. Programmatically means through the use of code, or computer-executable instructions. A programmatically performed step may or may not be automatic.
One or more embodiments described herein may be implemented using programmatic modules or components. A programmatic module or component may include a program, a subroutine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions. As used herein, a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs or machines.
Furthermore, one or more embodiments described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium. Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing embodiments of the invention can be carried and/or executed. In particular, the numerous machines shown with embodiments of the invention include processor(s) and various forms of memory for holding data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on many cell phones and personal digital assistants (PDAs)), and magnetic memory. Computers, terminals, network enabled devices (e.g. mobile devices such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, embodiments may be implemented in the form of computer-programs, or a computer usable carrier medium capable of carrying such a program.
System Description
As used herein, metadata, in the context of movies and programs, refers to information and content that is displayed to the user, but which is not part of the content itself. Rather, such metadata is delivered to the user watching the movie or program as an additional or independent layer.
In
System 100 includes metadata generation subsystem 110 and time based metadata service (or services) 120. The metadata generation subsystem 110 processes media files 102 corresponding to movies and/or programs. The output of the metadata generation subsystem 110 includes a metadata file 111, which includes a combination of time stamps 113 and associated information 115. The metadata file 111 includes the time stamps 113 by which synchronization triggers 131 can be programmatically or automatically activated by a playback device, in order to cause the rendering of the information or content items associated with a particular trigger.
Within the structure of the timeline for the media file, the time stamps 113 are timed to coincide with the occurrence of particular events in the media content. Each time stamp 113 may signify the beginning of an event, and last for a duration that coincides with the ending of the event. For example, the introduction of an actor into a scene may signify the beginning of a particular time stamp 113, and the actor's exit from the scene may signify the end of the event associated with the time stamp 113. Similarly, a time stamp 113 may be created when music or a particular song is played in the background of the media content, and that time stamp 113 may have a duration equivalent to the length of the song (e.g., the duration of time the music is playing). One frame or series of frames in the media content can be associated with multiple time stamps 113. Furthermore, each time stamp 113 is associated with a corresponding item of information or content 115. Such information or content 115 can be displayed over the movie/program content on a playback device, or on a separate, paired device, as a mechanism for providing the viewer with additional information.
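The structure described above, in which each time stamp carries a start time, a duration, and an associated information item, and multiple stamps may overlap a single frame, can be sketched as follows. Field and class names are illustrative only, not drawn from any particular implementation:

```python
from dataclasses import dataclass, field

@dataclass
class TimeStamp:
    start: float   # seconds into the media timeline (event begins)
    end: float     # event ends; duration = end - start
    info: dict     # associated information or content item

@dataclass
class MetadataFile:
    media_id: str
    stamps: list = field(default_factory=list)

    def active_at(self, t):
        """Return every event whose interval covers playback time t;
        one frame can be covered by multiple overlapping stamps."""
        return [s for s in self.stamps if s.start <= t <= s.end]

meta = MetadataFile("movie-001")
# An actor on screen from 2:00 to 3:05, and a song from 2:30 to 3:30.
meta.stamps.append(TimeStamp(120.0, 185.0, {"actor": "Jane Doe"}))
meta.stamps.append(TimeStamp(150.0, 210.0, {"song": "Theme Song"}))
```

At playback time 160 seconds both events are active, illustrating that a single frame can carry more than one time stamp.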
In some variations, the information and content displayed to the user can be interactive. For example, the user may select a link presented with the information or content item 115 as a mechanism for viewing additional information, or partaking in a supplemental activity (e.g. make online purchase).
In order to generate the metadata file 111, one or more embodiments of the metadata generation subsystem 110 may include a combination of manual and programmatic resources. In one embodiment, a programmatic/manual enhancement component 112 may be used to enable operators to identify time stamps 113 and associated information or content 115. With reference to an embodiment of
The user interface tool 800 may also include one or more event columns 814. For example, the event columns 814 may include a list of actors, as well as other events relating to the media content.
As also shown in
According to some embodiments, the metadata generation subsystem 110 can also include programmatic resources to automate the generation of some or all of the metadata. In particular, programmatic automation 114 may be used to identify some or all of the time stamps 113, as well as to generate some or all of the information or content items associated with individual time stamps 113. In one implementation, the programmatic automation 114 uses a script for the media content in order to determine when events occur in the media content (e.g. the introduction of a character into a scene), what the event pertains to (e.g. the character involved), and what the associated information or content item 115 should include (e.g. the name of actor who plays the character of the event). As an alternative or addition to the script, closed-captioned text associated with the media content can be similarly used.
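As a sketch of how closed-captioned text might be scanned for character events, the following uses a hypothetical cast list mapping characters to actors; the detection here is a simple name match against caption entries, not the fuller script analysis an implementation might perform:

```python
import re

def events_from_captions(captions, character_to_actor):
    """Scan (time, text) caption entries for character names and emit
    candidate time stamps with an associated information item.
    character_to_actor is assumed to come from an external cast list."""
    events = []
    for t, text in captions:
        for character, actor in character_to_actor.items():
            if re.search(r"\b" + re.escape(character) + r"\b",
                         text, re.IGNORECASE):
                events.append({"time": t, "event": character,
                               "info": {"actor": actor}})
    return events

# Illustrative caption data and cast list.
captions = [(12.5, "RICK: Of all the gin joints in all the towns..."),
            (40.0, "ILSA: Play it, Sam.")]
events = events_from_captions(
    captions,
    {"Rick": "Humphrey Bogart", "Ilsa": "Ingrid Bergman"})
```

Each emitted event pairs a caption time with a candidate information item, which an operator or downstream logic could then confirm or refine.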
Still further, some embodiments may use image analysis to identify persons or objects appearing in the content. For example, individual frames of the media content may be analyzed for faces, which are then compared to a facial recognition database of actors, in order to determine who the actor is that appears in the content.
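One minimal way to sketch such a comparison is as a nearest-neighbor match between face embeddings, assuming the embeddings themselves come from some external face-recognition model; the vectors, names, and threshold below are illustrative only:

```python
import math

def identify_actor(embedding, actor_db, threshold=0.6):
    """Match a face embedding against known actor embeddings using
    Euclidean distance; return the closest actor within the threshold,
    or None if no candidate is close enough. Embeddings are assumed to
    be produced by an external face-recognition model."""
    best_name, best_dist = None, float("inf")
    for name, ref in actor_db.items():
        d = math.dist(embedding, ref)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical reference embeddings for two actors.
db = {"Actor A": [0.10, 0.20, 0.30],
      "Actor B": [0.90, 0.80, 0.70]}
```

A production system would use higher-dimensional embeddings and a curated facial-recognition database of actors, but the matching step reduces to this form.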
As additional resources, the interface tool 800 can include information such as the cast and the soundtrack list. When an operator creates a time stamp 813, the operator can create a record that contains or references information about the particular event. The information or content items can be listed for the operator to edit (see list of records 818).
The time based metadata service 120 may utilize the metadata file 111 in one or more operational environments. In particular, the service 120 may provide (i) media content that is packaged with time based metadata (as generated by the metadata generation subsystem 110); or (ii) time based metadata in connection with media that is to be played back or rendered from a media source independent of the service 120.
In one operational environment, the service 120 combines the time based metadata 113 and media content 121 for broadcast and/or distribution. An intermediate service or step 122 may retransmit or deliver the combined time based metadata and media content 123 to a playback device 118. For example, a combined transmission of time based metadata and media content 121 may be supplied to a cable head end, which in turn broadcasts (or retransmits) the combined time based metadata and media content 123 to its user population. As another example, the time based metadata and media content 121 may be stored onto media (e.g., a Blu-ray disc) and distributed through Blu-ray disc distribution channels. Other forms of media may also be used to store the combined time based metadata and media content 121. For example, the combination of time based metadata and media content 121 may be supplied as a download from a website. Still further, the playback device 118 may correspond to a set-top box, and the intermediate service or step 122 may correspond to a cable head end that delivers the combined time based metadata and media 123 to the set-top box in the appropriate format (e.g., EBIF format).
In another operational environment, the service 120 operates to supply the time based metadata 125 independent of the media content that pertains to the metadata. Accordingly, for a particular media content, the service 120 supplies a delivery node or trigger 126 with the time based metadata 125 and media content identifier. The delivery node 126 may include a broadcast or distribution mechanism 127, and an application or resource 129 to integrate or otherwise combine the metadata 125 with the underlying media content. In particular, the metadata 125 may be structured to provide time stamps 113 and information or content items 115 all along a timeline that is based on the events that occur during the playback or rendering of the associated media content. The integration resource 129 may synchronize the metadata 125 with the media content in retransmitting, or otherwise making available (e.g. download), in enhanced form, the media content 130 that includes metadata.
In this way, the delivery node or trigger 126 is able to deliver, to a playback device 128, integrated or enhanced content 130 that includes the time based metadata and the associated content.
Various mediums of playback may be encompassed by an operational mode such as described with delivery node 126. For example, the delivery node 126 may correspond to a cable head end, and the playback device 128 may correspond to (or be provided through) a set-top box. The integrated content 130 may be delivered as EBIF-formatted content.
In some variations, the distribution node 126 acquires the time based metadata for particular media content on an as-needed or real-time basis. For example, in an implementation in which the playback device 128 is provided as or through a set-top box, one or more embodiments enable the set-top box or cable head end to request the time based metadata based on user input. In such an implementation, the distribution node 126 may supply the media content identifier to the service 120, in order to receive the time based metadata 125 for the corresponding media content.
In another operational environment, the service 120 supplies the time based metadata 143 to a playback device 138 that receives the media content from another source. In particular, a media source 132 may correspond to, for example, a cable head end, a website for downloading content, or a medium that contains stored media content. In the latter case, the media source 132 and the media playback device 138 may be provided by the same user device.
The user may operate the playback device 138 to render media content 135 from the media source 132. According to some embodiments, the end user can elect to enhance the media playback with the time based metadata 143 provided through service 120. In one implementation, the user may make a request 141 for time based metadata information from the service 120. The request 141 may specify the media content the user is about to watch. As an alternative or variation, the request 141 may be generated from the media source 132, which responds to programmatic conditions and/or user input.
Upon receiving the request, the service 120 obtains synchronization timing information 131 from the media source. In an embodiment in which the media source 132 corresponds to, for example, a cable head end or website, the service 120 may query the media source 132 by specifying the media ID 133 for the media content that is to be rendered on the playback device 138. The media source in turn may provide the service 120 with the synchronization timing information 131. The synchronization timing information 131 may include information that precisely indicates when a content item is to initiate, or which point (e.g., which scene or frame) in the timeline of the content is being rendered at the time of the query. As an alternative or variation, the service 120 may query, for example, the user's playback device 138, which can include media corresponding to the media source. For example, in some Blu-ray environments, the Blu-ray player may communicate with service 120 over a network in order to receive enhanced content in the form of time based metadata 143. In some embodiments, the playback device may be a television.
In some implementations, the media source 132 may supply the timing information 131 as a one-time event for the particular media content, thus initiating a clock or counter on the service 120. Alternatively, the timing information 131 may be provided repeatedly, or on an ongoing basis. With this information, the service 120 supplies time based metadata 143 to the playback device 138 in a manner that synchronizes the timeline of the metadata 143 with that of the media content.
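The clock-initiation approach described above can be sketched as follows. `SyncClock` is a hypothetical name; the one-time timing report anchors a local clock from which the service extrapolates the current playback position when deciding what metadata to deliver:

```python
import time

class SyncClock:
    """Track the playback position of remote media content from a
    one-time timing report, so metadata deliveries can be kept in
    sync with the media timeline."""
    def __init__(self, reported_position, reported_at=None):
        # reported_position: seconds into the media at the report time
        self.reported_position = reported_position
        self.reported_at = (reported_at if reported_at is not None
                            else time.monotonic())

    def position(self, now=None):
        """Estimated current playback position, assuming playback has
        continued uninterrupted since the report."""
        now = now if now is not None else time.monotonic()
        return self.reported_position + (now - self.reported_at)

# Media source reported position 300 s at (arbitrary) clock time 0.
clock = SyncClock(reported_position=300.0, reported_at=0.0)
```

When the timing information is instead provided repeatedly, each report simply re-anchors the clock, correcting for pauses or seeks that a one-time report cannot observe.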
In another operational environment, the service 120 supplies enhanced content in the form of time based metadata and the associated media content to a user end device for playback. The user end device may correspond to, for example, a tablet or personal computer that connects to the service 120 (or its affiliates or partners) via, for example, a website or web service. In some variations, the playback device 148 may correspond to, for example, a Blu-ray player or television which receives content and metadata over a network.
In the operational environment depicted, the user of the playback device 148 makes a media request 147 from the service 120. The media request 147 may identify an underlying media content, such as a movie or program. The service 120 supplies enhanced media 149 in that it includes both media and its associated time based metadata. The service 120 may pre-synchronize the time based metadata with the underlying media content so as to transmit the content and metadata concurrently or at one time. The playback device 148 may include programming resources that enable it to render both the media content and the information or content item included with the metadata.
As a variation, the service 120 may be configured to supply the time based metadata 157 to a paired device 160, in connection with corresponding media being rendered or played back on a user playback device 158. The user playback device 158 may correspond to a media source device, such as a Blu-ray player that plays back content from a disc, or a tablet that plays back content from a stored media file. Alternatively, the playback device 158 may receive the media 155 from the service 120. Still further, the playback device 158 may receive the media 155 from any media source, such as a cable head end or website.
In the example shown, the media playback device 158 receives the media from the service 120. Rather than communicating the time based metadata 157 to the media playback device 158, the metadata 157 may be communicated to the paired device 160. In one embodiment, the paired device 160 makes the request 161 for the time based metadata of an identified media content. The paired device 160 may be configured to identify itself as being a paired device, rather than a playback (or primary playback) device. Alternatively, the service 120 may be configured to identify from the request 161 that the paired device 160 is to receive the time based metadata 157 for an identified media content, where the media content is played back on a different device or medium. In response to the request 161, the service 120 supplies the time based metadata 157 to the paired device 160.
In an embodiment in which the service 120 provides the media 155, the service controls transmission of both the media content and the metadata. As such, the service 120 is able to synchronize the delivery of metadata information to the paired device 160, while transmitting media 155 to the playback device 158.
In an embodiment in which the media 155 is transmitted from an alternative media source, such as a cable head end or website, the service 120 may receive synchronization timing information for the media content being played back, so that the delivery of the time based metadata 157 to the paired device 160 is aligned with the timeline of the media content. The synchronization timing information may be received by querying the media source (e.g., the cable head end), as similarly described with other embodiments. Alternatively, the media playback device 158 may provide the necessary synchronization timing information, for example by initiating playback of the media content upon the paired device 160 receiving the time based metadata 157.
In an embodiment in which the media 155 is stored or otherwise local to the playback device 158, the paired device 160 may be operated to make a request 161 for the time based metadata that is associated with the media content that is to be played back on the device 158. For example, the user may operate the paired device 160 to identify the media content that is to be played back on the media playback device 158. Alternatively, the two devices 158, 160 may communicate with one another across, for example, a local communication port (e.g., Bluetooth or WiFi). Still further, the playback device 158 may be triggered to contact the service 120 in response to a playback request from a user (e.g., insertion or a play command of a Blu-ray disc, a play command for a stored media file on a tablet, etc.).
In response to receiving the request 161, the service 120 may query the media playback device 158 for synchronization timing information. For example, a user may be associated with an account on the service 120. Once the user accesses the service 120, the service 120 locates and communicates with both of the user's devices, thus enabling identification of the paired device for receiving the time based metadata 157.
Still further, in some variations, the time based metadata 157 may be communicated to the paired device 160 from the playback device 158. For example, the playback device 158 may synchronize both the time based metadata and the media content by receiving respective transmissions of one or both from an external source (e.g., a content source or service 120), and then retransmitting the metadata to the paired device 160. The transmission of the metadata 157 to the paired device 160 can be done through a communication medium such as, for example, Bluetooth or other short-range radio frequency communication channels.
With regard to embodiments described, time based metadata service(s) 120 may also be configured to map multiple media playback devices to each other, thereby enabling the devices to share metadata and/or associated content, even if one or more of the playback devices are not capable of communicating directly with each other.
As described with
In another embodiment such as illustrated in
In some embodiments, time based metadata service(s) 120 is configured to associate or map the content (e.g., as received from content service(s) 195) to the metadata file 111 and provide the newly associated content to one or more playback devices. For example, the metadata file 111 may be received via a metadata service(s) communication 170 of the time based metadata service(s) 120 and provided to a metadata content mapping component 171. The mapping component 171 then associates or maps the metadata 111 with content (e.g., movie or programming content) received from the content service(s) 195 via content service(s) communication 173. In some embodiments, the metadata 111 includes time stamps that correlate to scenes, frames and/or events in the content. The metadata-to-content mapping component 171 programmatically links the time stamps and the associated metadata to the run time of the corresponding content. For example, one file, or two linked files, may be created that can be played simultaneously by a playback device in order to render the content (e.g., a movie) and the time based metadata at the appropriate times (e.g., at the frame indicated by a time stamp).
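A minimal sketch of such a mapping step might filter the time stamps against the content's run time and emit a single combined record that a player can step through alongside the video; all field names here are illustrative:

```python
def map_metadata_to_content(content, stamps):
    """Link time stamps to the content's run time, producing one
    combined record. Stamps outside the run time are dropped; the
    rest are ordered as playback cues. Field names are illustrative."""
    runtime = content["runtime"]
    cues = sorted(
        ({"at": s["start"], "show": s["info"]} for s in stamps
         if 0 <= s["start"] <= runtime),
        key=lambda c: c["at"])
    return {"media_id": content["id"], "cues": cues}

# Hypothetical two-hour movie with two valid stamps and one stray.
enhanced = map_metadata_to_content(
    {"id": "movie-001", "runtime": 7200},
    [{"start": 300, "info": "actor bio"},
     {"start": 90, "info": "song title"},
     {"start": 9000, "info": "out of range"}])
```

The same structure could equally be emitted as two linked files (content reference plus cue list) rather than one combined record.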
The service 120 communicates the mapped metadata/content to a playback device using, for example, a delivery node 126 such as described above with respect to
The service 120 may also include a user accounts management service 175. The service 120 may associate individual users with accounts, which can incorporate pay-for-service functionality or other capabilities requiring the activation of an account. For example, functionality provided via the service 120 can be offered on a pay-per-view or monthly subscription basis. As an addition or alternative, account information can carry device and user profile information of a particular user. Such information may include (i) contact information about the user; (ii) one or more identifiers as to the type or types of playback device(s) the user is utilizing; (iii) historical data indicating the interactions of the user with a particular media file; (iv) linked accounts for other web services with which the user may have registered; and/or (v) preferences of the user regarding the output and display of the supplemental content.
According to some embodiments, service 120 provides programmatic enhancements that enable third parties (e.g., advertisers) to identify appropriate topics and information to supply with content. The service 120 includes a programmatic service communication (PSC) 176 that provides an interface for third parties (e.g., advertisers, content owners, movie studios or sponsors) to research content and time based metadata from a library or collection of media files (e.g., a studio's movie library). The PSC 176 can be used by a third party (e.g., via service 190) to specify additional metadata content, such as promotional metadata content. For example, an advertiser may use the PSC 176 to specify overlay ads or content that relate to a particular commercial product, or, conversely, a particular commercial product approved for display with a particular media asset. The advertiser may also specify the media content, or supply information to have the media content selected programmatically. The programmatic mapping component 177 can use information provided by the advertiser to perform one or more of the following: (i) select the media content on which the advertiser content is to be displayed, (ii) select the scene(s) or events of the media content, and/or (iii) identify the particular time stamp within the media content that is to serve as the trigger for the supplied content.
As an addition or alternative, the programmatic mapping component 177 may include logic (algorithms) to scan media content, its associated metadata and/or time based metadata in order to identify one or more semantic categories corresponding to the media being viewed by the user. This information may be communicated to the programmatic service(s) 190, which can enable manual or programmatic selection of supplemental metadata content in connection with the time based metadata.
For example, if the user is watching a program or movie about dogs and/or one or more of the time stamps is identified as corresponding to dogs, mapping 177 can communicate the information to the programmatic service(s) 190. The service 190 may then determine which supplemental content (e.g., advertisements regarding dog food, dog breeders, etc.) can be added to the media and provided to the user.
In certain embodiments, service 120 may also include a device identification service 178 configured to identify the type of playback device utilized by a user. For example, the device identification service 178 may determine that a user has access to, or is using, a particular device such as, for example, a tablet computer, Blu-ray player, or set-top box, as well as the communication format for each device (as determined by device communications 179). In one or more embodiments, once the device information has been determined, the information may be associated with a user and stored using the user accounts management 175. Device identification may also trigger the service 120 to operate in one of multiple possible computing environments, as described with prior embodiments.
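A simple sketch of such an identification step might map a reported device type to its expected communication format; the table below is illustrative only, not an exhaustive or authoritative mapping, and a real implementation might instead inspect protocol headers or identifiers reported via device communications 179:

```python
# Illustrative device-type-to-format lookup; entries are assumptions,
# not an authoritative mapping.
DEVICE_FORMATS = {
    "set-top box": "EBIF",
    "android tablet": "XML",
    "blu-ray player": "XML",
    "web browser": "JSON",
}

def format_for_device(device_type):
    """Return the communication format expected for a device type,
    defaulting to JSON for unrecognized devices."""
    return DEVICE_FORMATS.get(device_type.lower(), "JSON")
```

Once resolved, the format can be stored with the user's account so subsequent deliveries are transcoded without re-identification.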
Service 120 may also include a mapping tool 180 configured to associate content received from programmatic mapping service(s) 190 with metadata 111. An exemplary mapping tool 180 is shown in
Referring to
Referring back to
In one embodiment, the imported metadata and related time stamps may be imported in a first format (e.g., the format in which they were created) and transferred to a playback device as is. In another embodiment, the time stamps and metadata are programmatically processed to match the structure of the metadata file 111 and/or the playback device to which the information will be sent.
In another embodiment illustrated by
Methodology
The media content is processed in order to identify the occurrence of events for which supplemental information or content can be provided (220). Various kinds of events may be detected in the media content. These include, for example, events pertaining to (i) an actor or character (222), (ii) the use of objects (224), such as commercial objects, (iii) the playing of songs or supplemental media (e.g., a television show in the background) as part of the media content, and/or (iv) the depiction of locations (226) (e.g., landmarks, geographic regions, etc.).
The events of the media content may be individually associated with time stamps that are included as part of an associated metadata file (230). The metadata file is structured to reflect a timeline of the media content. The occurrence of an event in the media content may be signified by a trigger, which is marked in the metadata file to coincide with the occurrence of the event at the playtime of the media content. In this way, individual triggers are associated with corresponding events of the media content.
Once events are identified, the events may be linked to pertinent information or content (240). For example, time stamps that signify the presence of an actor or character in a scene of the media content may be linked to a data item that includes additional information about the actor, the character or the scene. Each time stamp can be linked to a data file that contains additional information pertaining to the event signified by the time stamp. The data file may be created programmatically and/or manually. For example, if an actor is signified by a time stamp, programmatic resources may automatically identify a profile of the actor from a movie database and include information (or a link) to the movie profile as part of the data item that is associated with the time stamp. As an alternative or addition, an operator may manually edit or create content for the data item that is associated with the trigger.
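The linking step above can be sketched with a hypothetical movie-database lookup keyed by actor name; the database, field names, and URL are illustrative only:

```python
def link_events_to_info(timestamps, movie_db):
    """Attach an information item to each time stamp; here, a profile
    link looked up in a hypothetical movie database keyed by actor
    name. Stamps with no match are left for manual editing."""
    for stamp in timestamps:
        actor = stamp.get("actor")
        if actor in movie_db:
            stamp["info"] = {"profile": movie_db[actor]}
    return timestamps

# Illustrative data: one stamp resolves, one is left for an operator.
stamps = link_events_to_info(
    [{"start": 120.0, "actor": "Jane Doe"},
     {"start": 400.0, "actor": "Unknown Actor"}],
    {"Jane Doe": "https://example.com/profiles/jane-doe"})
```

The unmatched stamp illustrates the manual path described above: an operator can edit or create content for any data item the programmatic pass could not fill.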
In some implementations, metadata generation subsystem 110 processes media content in order to detect events that occur as part of the media content. The metadata generation sub-system 110 associates the detected events with time stamps, which are in turn associated with timing information that signifies when the events occur during the playback of the media content. Additionally, the detected events can be associated with information or content items that supplement the content during portions that coincide with the placement of a corresponding time stamp (signifying when the event is taking place as part of the content). In one implementation, the metadata generation sub-system 110 includes a manual editor interface (e.g. see
As an alternative or addition, the process of event detection and subsequent data item association may be performed programmatically and with automation. In one implementation, individual frames that comprise the video portion of the media content are analyzed using image analysis. For example, individual frames are analyzed to detect faces, which are then associated with the identities of persons (e.g., actors) from a database. Likewise, frames of the media content may be analyzed to detect objects (e.g., commercial objects) and to recognize the objects by type or brand using image analysis. The objects can then be mapped to programmatic services 190 or content services 195.
Once the metadata file is created with time stamps and linked information, the metadata file is made available to manual operators. In certain embodiments, the manual operators may utilize tools shown and described with respect to
The time stamps and metadata are then communicated to playback devices and/or media sources for use with playback of the associated media file (250). Depending on the implementation, the contents of the metadata file can be transmitted in real-time, to coincide with playback of the associated media content on a playback device. Alternatively, the contents of the metadata file can be transmitted and stored in a playback environment, for use when the associated media content is played back from a particular source or on a paired device 160 at a later time. For example, in a cable/set-top box environment, the metadata file and/or its contents can be transmitted to a media source (e.g. cable head-end, website) where it is re-transmitted to a user playback device (e.g. set-top box).
Alternatively, the metadata may be transmitted to the media source (e.g. cable broadcast, website). The receiving media source may combine a media content transmission with the metadata, so as to transmit the media content and the metadata to the user for playback. According to some embodiments, the media source may include programmatic resources for (i) requesting the metadata that is to accompany the media content transmission, and (ii) combining transmission of the metadata with the media content, including synchronizing the delivery of metadata with the transmission of the media content along a timeline that is reflected by the media content.
Still further, the contents of the metadata file can be communicated to the playback device while the playback device receives the media content from another media source (e.g. cable/set top box, website). For example, the playback device may receive the metadata from time metadata service 120, while receiving the coinciding media content from a media source (e.g. cable head end). The service 120 may synchronize the transmission of the metadata using reference timing information for the playback of the media content (e.g. start time). Alternatively, the playback device may synchronize the receipt of metadata from one source with the receipt of media content from another source in creating an enhanced playback containing both the media content and the time based metadata.
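One way to sketch the start-time-based synchronization described above, assuming (for illustration only) that the reference timing information is a wall-clock start time in seconds:

```python
import time

def items_due(metadata, start_time, now=None):
    """Select metadata items whose time stamp has been reached.

    metadata: dict mapping timestamp_seconds -> data item
    start_time: wall-clock time (seconds) at which playback began,
                i.e. the reference timing information acquired from
                the media source or playback device
    """
    if now is None:
        now = time.time()
    position = now - start_time  # current position on the media timeline
    return {ts: item for ts, item in metadata.items() if ts <= position}
```

Either the service or the playback device could run such a selection loop; the difference described above is only which side holds the reference timing information.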
In some variations, the contents of the metadata file may be transmitted and stored by, for example, a user of a playback device. The stored file can be accessed and used when the playback device renders the associated media content or file.
As another variation or alternative, the metadata file, or its contents, may be transmitted to a paired device of a user, for use in conjunction with rendering of the associated media content on a playback device. For example, the metadata file may be transmitted over a network to a tablet device that the user can hold and view, while watching the associated media content on a set-top box or Blu-ray environment.
Numerous variations to how the generated metadata is communicated and used are described with, for example, embodiments of
In an embodiment of
In some variations, the media playback device may request and receive the metadata independently from the media source. For example, the media playback device may request the metadata from the service 120. The service 120 may acquire the synchronization timing information from the media source, and deliver metadata in real-time to the playback device. Alternatively, the playback device may receive the metadata in real-time or synchronously, and include programmatic resources for synchronizing the playback of the metadata with the associated media content. Numerous such variations are possible.
The service 120 may respond to the request in real-time, so as to transmit metadata concurrently and in-sync with playback of the media content on the playback device. In responding to the request, service 120 may acquire synchronization timing information in order to transmit the media content in sync (520). The synchronization timing information may correspond to, for example, the exact time that the media content was initiated, or the precise frame or portion of the media content that is rendered at a reference time. In one implementation, the service 120 acquires the synchronization timing information from the media source (522). In another implementation, the service 120 acquires the synchronization timing information from the media playback device (524).
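The two acquisition paths (522) and (524) might be sketched as below; the dictionary-shaped source objects and the `playback_start` field are assumptions for the example, not an interface defined by the specification.

```python
# Sketch of the service-side acquisition of synchronization timing
# information, per implementations (522) and (524) above.

def acquire_sync_time(media_source=None, playback_device=None):
    """Return the reference time at which playback of the media began.

    media_source / playback_device: hypothetical records exposing a
    'playback_start' field (e.g. the exact time playback was initiated).
    """
    if media_source is not None:
        return media_source["playback_start"]      # implementation (522)
    if playback_device is not None:
        return playback_device["playback_start"]   # implementation (524)
    raise ValueError("no source of synchronization timing available")
```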
In some variations, the media source may reside on a user device. For example, the media source may correspond to a Blu-ray player. The synchronization timing information may be communicated from the Blu-ray player of the user.
The service 120 may then transmit the metadata information in real-time to the device of the user (530). In one embodiment, the metadata is transmitted to the media playback device of the user (532). In a variation, the metadata is transmitted to a paired device of the user (534). The paired device may correspond to, for example, a tablet, smart phone or laptop, while the media playback device may correspond to a set-top box and television, personal computer, Blu-ray player or Internet television. As an example, the service 120 may maintain an account for the user, from which it identifies and locates the media playback device and the paired device on a network. In another embodiment, the service 120 can broker a request. For example, service 120 may provide response actions to a user interacting with the media, either through the playback device or the paired device, and connect the user to programmatic services 190 or content services 195.
In
Metadata Placement and Development
Various parties may utilize an interface or tool to develop metadata themes of content for inclusion with a media content. For example, an operator of the metadata generation sub-system 110 may place metadata according to default criteria (e.g. identify all actors, songs, bibliography, etc.). Other operators, such as users of the service 190, may place metadata by themes. For example, the operators may be provided an interface (e.g. See
For example, referring to
Referring to
Second Screen Examples
The interfaces of
In the embodiment of
Link Sharing
With reference to
In some variations, the user input can identify a predefined segment (1114). The predefined segment may have a set of predefined features, for example a predefined start/finish time.
The user playback device 158 and/or the paired device 160 may respond to the user input by, for example, communicating the identified segment to a service or set top box, etc. (1116). The paired device 160 can, for example, determine, from user input, the start/end times for a movie, and then communicate the start/end times (along with an identifier of the movie) to the service or set top box.
In some variations, for example, a user can operate the paired device 160 in order to provide input that identifies the desired segment (1118). For example, a user's interaction with a second screen (e.g., touch screen) can be used to identify a movie segment or its start and/or end times. In such an embodiment, a second screen interaction (1118) may be presented to the user in order to identify, for example, a predefined segment. Various other input mechanisms and/or interfaces can be used to enable a user to specify an input corresponding to a segment of the movie.
After responding to a user input and/or second screen interaction, a link is generated (1120). The link may be generated to include, for example, data elements that provide or correspond to one or more of the following: (i) data that identifies a location of a source file for the movie or movie segment (1122), (ii) start and end times for the segment (1124), and (iii) time-based metadata that is associated with the segment of the movie (e.g., persons and/or objects that appear or are referenced in the identified segment) (1126). In some variations, an identifier for the movie segment can be identified by the link. As an example, the link can be a selectable hypertext link that, when triggered, causes the identified movie segment 1110 to be played or otherwise displayed. The generated link can be published (1130). This can include publishing the link on a social networking account (1132) (e.g. Facebook, or any social network known in the art) or sharing the link through messaging (1134). For example, the link can be shared by sending the link by e-mail, text message, or another direct form of communication.
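The link-generation step (1120), with data elements (1122)-(1126), might be sketched as below. The URL scheme, parameter names, and base address are all illustrative assumptions; the specification does not define a link format.

```python
from urllib.parse import urlencode

def build_segment_link(source_url, start, end, metadata_ids=(),
                       base="https://example.com/clip"):
    """Construct a shareable link for a movie segment.

    source_url: location of a source file for the movie or segment (1122)
    start, end: segment boundaries, e.g. in seconds (1124)
    metadata_ids: identifiers of time-based metadata associated with the
                  segment, e.g. persons or objects appearing in it (1126)
    """
    params = {"src": source_url, "start": start, "end": end}
    if metadata_ids:
        params["meta"] = ",".join(metadata_ids)
    return base + "?" + urlencode(params)
```

Publishing (1130) would then amount to posting the returned string to a social networking account or sending it by e-mail or text message.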
In the example provided, the user triggers a selectable feature 1210. The feature 1210 can be programmed to, for example, identify a segment, or alternatively receive user input that identifies a start/finish for the movie segment. In the example provided, once the segment is identified, time-based metadata (e.g., related shopping panel 1214) can be displayed to the user. The user input can also be used to enable the user to publish the clip 1215 of the segment. For example, the user can communicate the clip 1215 to the service, with or without, for example, the time-based or rich metadata.
In
Computer System
Additionally, a metadata generation sub-system 110 may be implemented using a computing system such as shown and described by
In an embodiment, computer system 1300 includes processor 1304, main memory 1306, ROM 1308, storage device 1310, and communication interface 1316. Computer system 1300 includes at least one processor 1304 for processing information. Computer system 1300 also includes a main memory 1306, such as a random access memory (RAM) or other dynamic storage device, for storing information and instructions to be executed by processor 1304. Main memory 1306 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1304. Computer system 1300 may also include a read only memory (ROM) 1308 or other static storage device for storing static information and instructions for processor 1304. A storage device 1310, such as a magnetic disk or optical disk, is provided for storing information and instructions. The communication interface 1316 may enable the computer system 1300 to communicate with one or more networks through use of the network link 1320.
Computer system 1300 can include display 1312, such as a cathode ray tube (CRT), an LCD monitor, or a television set, for displaying information to a user. An input device 1314, including alphanumeric and other keys, is coupled to computer system 1300 for communicating information and command selections to processor 1304. Other non-limiting, illustrative examples of input device 1314 include a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1304 and for controlling cursor movement on display 1312. While only one input device 1314 is depicted in
Embodiments described herein are related to the use of computer system 1300 for implementing the techniques described herein. According to one embodiment, those techniques are performed by computer system 1300 in response to processor 1304 executing one or more sequences of one or more instructions contained in main memory 1306. Such instructions may be read into main memory 1306 from another machine-readable medium, such as storage device 1310. Execution of the sequences of instructions contained in main memory 1306 causes processor 1304 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement embodiments described herein. Thus, embodiments described are not limited to any specific combination of hardware circuitry and software.
Although illustrative embodiments have been described in detail herein with reference to the accompanying drawings, variations to specific embodiments and details are encompassed by this disclosure. It is intended that the scope of embodiments described herein be defined by claims and their equivalents. Furthermore, it is contemplated that a particular feature described, either individually or as part of an embodiment, can be combined with other individually described features, or parts of other embodiments. Thus, absence of describing combinations should not preclude the inventor(s) from claiming rights to such combinations.
This application is a continuation of U.S. patent application Ser. No. 13/523,829, filed on Jun. 14, 2012, which claims priority to Provisional U.S. Patent Application No. 61/497,023, filed on Jun. 14, 2011; both applications are incorporated herein by reference in their entirety.
Entry |
---|
US Patent Application filed Jan. 10, 2012, entitled “Interface for Displaying Supplemental Dynamic Timeline Content”, U.S. Appl. No. 61/631,814. |
US Patent Application filed Jul. 25, 2016, entitled “System to Select Supplemental Content for Playback Devices”, U.S. Appl. No. 62/366,540. |
US Patent Application filed Jul. 14, 2016, entitled “Metadata Delivery System for Rendering Supplementary Content”, U.S. Appl. No. 62/362,587. |
US Patent Application filed Jun. 14, 2011, entitled “System and Method for Presenting Content with Time Based Metadata”, U.S. Appl. No. 61/497,023. |
US Patent Application filed Oct. 22, 2016, entitled “Supplemental Content Playback System”, U.S. Appl. No. 15/331,815. |
Number | Date | Country | |
---|---|---|---|
20170339462 A1 | Nov 2017 | US |
Number | Date | Country | |
---|---|---|---|
61497023 | Jun 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13523829 | Jun 2012 | US |
Child | 15670883 | US |