Television viewing is no longer the static, isolated, passive pastime it once was. Today, viewers have the option of using a computing device, such as a tablet computer, to view a webpage related to a show they are watching, thereby keeping them engaged in a particular program. However, many other webpages compete for the viewer's attention, and there is a demand for new and interesting ways to keep the viewer engaged with the webpage that is related to the particular program.
Some or all of the various features described herein may facilitate discovery, organization, and presentation of supplemental content (e.g., second screen content, or same-device companion content) on a second user device (e.g., a second screen device such as a tablet computer, smartphone, laptop, etc.) or a first user device (e.g., a first screen device such as a television or video display) to complement primary content displayed on a first user device, thereby providing a desirable second screen, or augmented first screen, experience.
In accordance with aspects of the disclosure, an item detection system is provided for supplying appropriate items, such as computer applications, Internet pages, and other interactive content, based on context information regarding a user's current activity. The detected items may be supplied to various user devices for presentation in a variety of screens. For example, the items may be presented in an interface, e.g., a program guide, or other screens accessible through a first screen device, such as a television, or second screen device, such as a tablet. The item detection system, therefore, may provide a means through which users may discover items related to content they are consuming. Additional features of the item detection system with respect to how context information is obtained, how items are detected, how detected items are arranged, how detected items are presented, and how detected items may be downloaded and/or launched are also taught in this disclosure.
Further, other aspects of the disclosure include a supplemental content presentation application and a system for supporting said application. In an illustrative embodiment, this application may include a timeline of events relating to a program, such as a video program. The system may provide this timeline to said application running on a user device, such as a tablet computer, which may present the timeline on a screen for a user to view. The timeline may be utilized to synchronize supplemental content with primary content so that, as the primary content is presented to the user on the same user device or a different one, corresponding supplemental content may be presented on the user device. Users may also interact with the timeline to select points along the timeline, which are associated with portions of the primary content, and access supplemental content corresponding to those points.
The system may receive and/or provide updates to the timeline from an administrative entity and may generate instructions, including supplemental content, that cause the user device to modify the timeline to present the supplemental content at a point along the timeline. Modifying the timeline may include adding a marker on the timeline. Users may also modify the timeline by performing various actions that may cause other markers to be added to the timeline. Further, the system may receive data feeds from social network services and other news sources. The data feeds may include messages that the system may deliver to a user device. A user of the user device may select one of the messages, thereby causing a marker to be added to the timeline. The selection may also create a report that is sent to the system. Based on a number of reports, the system may determine whether a marker should be added to the timeline so that a message may be featured for other users to consume.
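The report-driven featuring described above can be sketched in code. The following is a minimal, illustrative sketch only; the class and threshold names (`Timeline`, `FEATURE_THRESHOLD`) are assumptions for illustration and are not specified by the disclosure.

```python
# Hypothetical sketch: user selections of a message generate reports, and once
# enough reports accumulate, a marker featuring that message is added to the
# timeline for other users. Threshold and data shapes are assumed.
from dataclasses import dataclass, field

FEATURE_THRESHOLD = 3  # assumed number of reports needed to feature a message


@dataclass
class Timeline:
    markers: list = field(default_factory=list)        # (time_sec, message) pairs
    _report_counts: dict = field(default_factory=dict)

    def report_selection(self, time_sec: float, message: str) -> bool:
        """Record that a user selected a message; feature it once enough reports arrive."""
        key = (time_sec, message)
        self._report_counts[key] = self._report_counts.get(key, 0) + 1
        if self._report_counts[key] >= FEATURE_THRESHOLD and key not in self.markers:
            self.markers.append(key)   # marker added so other users may consume the message
            return True
        return False


tl = Timeline()
tl.report_selection(120.0, "Great goal!")
tl.report_selection(120.0, "Great goal!")
featured = tl.report_selection(120.0, "Great goal!")  # third report crosses the threshold
```

In a deployed system the threshold might instead be relative (e.g., a fraction of concurrent viewers), but the decision logic would follow the same shape.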
Additionally, aspects of the present disclosure teach computing devices, having a processor and memory storing computer-executable instructions, and other apparatuses to perform the above steps and other steps for discovering items and improving a second screen experience.
Other details and features will also be described in the sections that follow. This summary is not intended to identify critical or essential features of the inventions claimed herein, but instead merely summarizes certain features and variations thereof.
Some features herein are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements.
In the following description of various illustrative embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown, by way of illustration, various embodiments in which aspects of the disclosure may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional modifications may be made, without departing from the scope of the present disclosure.
By way of introduction, the various features described herein may allow a user to discover an item, such as a supplemental content presentation application, and download that item to a second screen device (e.g., a tablet), or interact with that content on a first screen (e.g., a television or other device presenting audio or video content). If the supplemental content presentation application is downloaded, the second screen device may present supplemental content to a user while the user is consuming primary content on a first screen device (e.g., a television). If the supplemental content presentation application is used on the first screen device, then the supplemental content may be presented in one of many modes while the user is consuming primary content on that same device; for example, interactive content may overlay the video content, or be presented beside or around it. A companion content experience (also referred to as a second screen experience), in which supplemental content may be presented on a first screen device or second screen device, may be enhanced by various features of the supplemental content presentation application, such as a timeline that users may interact with and modify.
There may be one link 101 originating from the local office 103, and it may be split a number of times to distribute the signal to various premises 102 in the vicinity (which may be many miles) of the local office 103. The links 101 may include components not illustrated, such as splitters, filters, amplifiers, etc. to help convey the signal clearly, but in general each split introduces a bit of signal degradation. Portions of the links 101 may also be implemented with fiber-optic cable, while other portions may be implemented with coaxial cable, other lines, or wireless communication paths.
The local office 103 may include an interface, such as a termination system (TS) 104. More specifically, the interface 104 may be a cable modem termination system (CMTS), which may be a computing device configured to manage communications between devices on the network of links 101 and backend devices such as servers 105-107 (to be discussed further below). The interface 104 may be as specified in a standard, such as the Data Over Cable Service Interface Specification (DOCSIS) standard, published by Cable Television Laboratories, Inc. (a.k.a. CableLabs), or it may be a similar or modified device instead. The interface 104 may be configured to place data on one or more downstream frequencies to be received by modems at the various premises 102, and to receive upstream communications from those modems on one or more upstream frequencies.
The local office 103 may also include one or more network interfaces 108, which can permit the local office 103 to communicate with various other external networks 109. These networks 109 may include, for example, networks of Internet devices, telephone networks, cellular telephone networks, fiber optic networks, local wireless networks (e.g., WiMAX), satellite networks, and any other desired network. The network interface 108 may include the corresponding circuitry needed to communicate on the external networks 109, and with other devices on those networks, such as the cell phones of a cellular telephone network.
As noted above, the local office 103 may include a variety of servers 105-107 that may be configured to perform various functions. For example, the local office 103 may include a push notification server 105. The push notification server 105 may generate push notifications to deliver data and/or commands to the various premises 102 in the network (or more specifically, to the devices in the premises 102 that are configured to detect such notifications). The local office 103 may also include a content server 106. The content server 106 may be one or more computing devices that are configured to provide content to users at their premises. This content may be, for example, video on demand movies, television programs, songs, text listings, etc. The content server 106 may include software to validate user identities and entitlements, to locate and retrieve requested content, to encrypt the content, and to initiate delivery (e.g., streaming) of the content to the requesting user(s) and/or device(s).
The local office 103 may also include one or more application servers 107. An application server 107 may be a computing device configured to offer any desired service, and may run various languages and operating systems (e.g., servlets and JSP pages running on Tomcat/MySQL, OSX, BSD, Ubuntu, Redhat, HTML5, JavaScript, AJAX and COMET). For example, an application server may be responsible for collecting television program listings information and generating a data download for electronic program guide listings. Another application server may be responsible for monitoring user viewing habits and collecting that information for use in selecting advertisements. Yet another application server may be responsible for formatting and inserting advertisements in a video stream being transmitted to the premises 102. Although shown separately, one of ordinary skill in the art will appreciate that the push server 105, content server 106, and application server 107 may be combined. Further, here the push server 105, content server 106, and application server 107 are shown generally, and it will be understood that they may each contain memory storing computer executable instructions to cause a processor to perform steps described herein and/or memory for storing data, such as information for identifying a user or second screen device.
An example premises 102a, such as a home, may include an interface 120. The interface 120 can include any communication circuitry needed to allow a device to communicate on one or more links 101 with other devices in the network. For example, the interface 120 may include a modem 110, which may include transmitters and receivers used to communicate on the links 101 and with the local office 103. The modem 110 may be, for example, a coaxial cable modem (for coaxial cable lines 101), a fiber interface node (for fiber optic lines 101), twisted-pair telephone modem, cellular telephone transceiver, satellite transceiver, local wi-fi router or access point, or any other desired modem device. Also, although only one modem is shown in
The
In some embodiments, the supplemental content delivery manager 201a may be implemented as an application specific integrated circuit (ASIC). That is, the supplemental content delivery manager 201a may be a chip designed specifically for performing the various processes described herein. Further, the ASIC may be implemented within or in communication with various computing devices provided herein.
One or more aspects of the disclosure may be embodied in computer-usable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other data processing device. The computer executable instructions may be stored on one or more computer readable media such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. As will be appreciated by one of skill in the art, the functionality of the program modules may be combined or distributed as desired in various embodiments. In addition, the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), and the like. Particular data structures may be used to more effectively implement one or more aspects of the disclosure, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
Also, while consuming primary content, each user may operate a respective second screen device 302 to consume supplemental content (e.g., second screen content) related to the primary content presented on the first screen device 301 at their premises 102. For example, user A may operate a second screen device 302, such as a smartphone, to consume second screen content, such as a poll through which user A may vote for a contestant shown in the primary content presented on the first screen device 301. The second screen content may be any data that provides information or content to supplement primary content, which may be the video content (e.g., linear television program, on-demand movie, etc.) presented on a first screen device 301. For example, second screen content may include a link to a webpage of a product shown in an advertisement of the primary content, a video clip with bonus features, text and/or images with information about the content itself or about individuals or items shown in the primary content, advertisements, coupons, questions pertaining to the primary content, etc. In some embodiments, the various second screen content may be generated by ordinary, everyday consumers of the primary content, such as viewer reviews of a video program, chat room discussions of a movie, etc. In some embodiments, the second screen content may be from formal primary content sources, such as the same source that provided the primary content (e.g., a television company may provide both a television program as primary content, and a companion Internet page as secondary content to accompany the display of the primary content).
The appearance of the second screen content may be generated by the second screen device 302 using software that is previously stored, or it may be dynamically retrieved or received when it is desired, and the timing of when the second screen content appears (e.g., when a particular Internet link should appear, or when a particular image should be displayed) may be based on triggers (e.g., Enhanced Binary Interchange Format (EBIF) triggers) or signals that are received along with, or in addition to, the primary content stream. Triggers may also be generated by other methods such as, but not limited to, (1) by analyzing audio and/or video signals to determine a position in a program (e.g., automated content recognition), or (2) by explicitly accessing the media time of a video asset. In both of these additional cases, the time within a program can be used to compare against a list of triggers for a program in order to identify an appropriate trigger. In any event, EBIF and/or time-based trigger files may be combined with contextual information to launch, or offer for launch, supplemental content.
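The time-based comparison described in cases (1) and (2) above amounts to a lookup of the current media time against a sorted trigger list. A minimal sketch follows; the trigger-list format and payload strings are assumptions for illustration.

```python
# Illustrative sketch: given the current media time (obtained via automated
# content recognition or an explicit time code), find the most recent trigger
# in a program's trigger list. Triggers are (start_time_sec, payload) pairs
# sorted by start time; payload contents are hypothetical.
import bisect


def find_trigger(triggers, media_time):
    """Return the payload of the last trigger at or before media_time, or None."""
    times = [t for t, _ in triggers]
    i = bisect.bisect_right(times, media_time) - 1
    return triggers[i][1] if i >= 0 else None


triggers = [
    (0.0, "show-title-card"),
    (95.5, "poll-contestant-3"),
    (300.0, "sponsor-link"),
]
payload = find_trigger(triggers, 120.0)  # falls between the second and third triggers
```

Binary search keeps the lookup cheap even for long trigger lists, which matters when the comparison runs continuously as playback advances.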
Referring to
Further, each of the second screen devices 302 may be configured to bi-directionally communicate via a wired and/or wireless connection with the second screen experience manager 340 via the network 330. Specifically, the second screen devices 302 may be configured to access the network 330 (e.g., the Internet) to obtain second screen content and to transmit/receive signals via the network 330 to/from the second screen experience manager 340. For example, a second screen device 302 may transmit information, such as requests for second screen content, through a wired connection, including the links 101 through which the primary content is supplied to a first screen device 301, to the local office 103 which then routes the transmission to the network 330 so that it may eventually reach the second screen experience manager 340. That is, the second screen device 302 may connect to the interface 120 and communicate with the second screen experience manager 340 over the links 101 used to transmit the primary content downstream. Alternatively, a second screen device 302 may wirelessly communicate via, for example, a WiFi connection and/or cellular backhaul, to connect to the network 330 (e.g., the Internet) and ultimately to the second screen experience manager 340. Accordingly, although not shown, the network 330 may include cell towers and/or wireless routers for communicating with the second screen devices 302.
Although
Still referring to
When an option is selected, content information 403 related to the content associated with that option may be displayed in a preview frame 402. In the case of
Additionally, the program guide 400 may include a discovery frame 405. The discovery frame 405 may include other items 406 related to the selected content. Examples of related items 406 may include media (e.g., audio, video, audio/video content), links (e.g., URLs), applications (or computer programs), advertisements, and the like. As shown in
While a particular piece of content is selected, e.g., “The Office,” a user may choose one of the related items 406 in the discovery frame 405. The items 406 in the discovery frame 405 may automatically change as a user navigates to other options 401 within the program guide 400 so that the items 406 in the discovery frame 405 correspond to the selected option 401. That is, the discovery frame 405 may be populated and continuously, or intermittently, updated with items 406 identified by an item detection system described in more detail below. In this manner, a user may discover one or more items 406 available for a particular piece of content.
Moreover, a user may navigate the discovery frame 405 to select the items 406 shown. While an item 406 is selected, a user may choose to download the selected item 406 (e.g., receive and store an executable file of the selected item 406). The item 406 may be downloaded to an interface 120 (e.g., a set top box), a media center coupled to the interface 120, a first screen device 301, or a second screen device 302. In some examples, the item 406 may first be downloaded to an interface 120 and then forwarded to the second screen device 302 of the user who made the selection to download and/or to other second screen devices 302 in communication with the interface 120.
In some cases, one or more items 406 may have already been downloaded to, e.g., an interface 120 or media center coupled thereto. In such cases, while an item 406 that has already been downloaded and cached is selected, a user may choose to launch the item 406 on a first or second screen device 301, 302. Accordingly, when an item 406 is first selected, a local computing device (e.g., the interface 120) may check to determine whether the item 406 has been downloaded already before requesting it to be downloaded. To support such an embodiment, the local computing device may include a local cache to store a list of already downloaded items and/or the items themselves. Also, where the item is a link or HTML element, selecting the item may trigger a first screen device 301 or second screen device 302 to launch a browser to view content associated with the link or HTML element.
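The cache-before-download behavior above can be sketched briefly. This is an illustrative sketch only: the class name, the `fetch_item` callback, and the item identifier are hypothetical stand-ins, since the disclosure does not specify the download interface.

```python
# Hedged sketch: the local computing device (e.g., the interface) consults its
# local cache of already-downloaded items first, and only requests a download
# on a cache miss. fetch_item stands in for the unspecified download path.

class ItemCache:
    def __init__(self):
        self._items = {}  # item_id -> downloaded item content

    def get_or_download(self, item_id, fetch_item):
        """Return (item, was_cached); download and cache the item on a miss."""
        if item_id in self._items:
            return self._items[item_id], True   # already downloaded: launch immediately
        item = fetch_item(item_id)              # cache miss: request the download
        self._items[item_id] = item
        return item, False


cache = ItemCache()
downloads = []

def fetch_item(item_id):
    downloads.append(item_id)           # track how many real downloads occur
    return f"payload-for-{item_id}"

_, hit1 = cache.get_or_download("silly-jokes-app", fetch_item)
_, hit2 = cache.get_or_download("silly-jokes-app", fetch_item)
```

The second request is served from the cache, so only one download is performed, matching the check-then-download flow described above.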
In addition, the discovery frame 405 may include a search box (not shown) in which a user may insert keywords to find related items 406. Such a search box may be used to refine the search results in the discovery frame 405 or to conduct a new search for all available items 406. However, in some examples such a search box might not exist, and thus, a user may discover related items without searching but by navigating the program guide 400 instead. Further, although
The detailed information frame 412 may also provide item description information 413. The item description information 413 may include a description of the item explaining features of the item, what platform the item is compatible with, how much memory is needed to download the item, how long it will take to download the item, how much the item costs, etc. The item description information 413 may also include a rating of one or more of the items and/or a preview of one or more of the items (e.g., a video clip showing features of the item). Further, the item description information 413 may also include a link to a page (e.g., a webpage) including more information if the item description information 413 does not fit within the detailed information frame 412 (see the “more info” link in
In the example of
Instead of performing a search as described above, in some cases programs may be associated with a particular genre (e.g., science fiction, comedy, sports, movies, etc.), and a user may select a particular genre to view items related to that genre. Notably, the programs returned as a result of the search or selecting a genre may include linear content as well as non-linear content (e.g., content previously broadcasted or to be broadcasted in the future, on-demand content, etc.). Thus, one of skill in the art should appreciate that items related to non-linear content may also be discoverable using a system in accordance with this disclosure.
Using the triggers 450, audio signals 460, and/or an explicit time code, the second screen device 302 may determine which portion (or segment) of the content is being presented. Based on this information, the second screen device 302, and in particular, the “Watch With Me App” running on the second screen device 302, may display corresponding supplemental content. That is, the supplemental content presented on the second screen device 302 may be synchronized with the content presented on the first screen device 301. Furthermore, the second screen device 302 may present related items in a discovery frame 470 that are also synchronized with the supplemental content and/or primary content. For example, as shown in
Also, although the invitation 490 is shown in
Notably, the webpage 500 may include a discovery frame 501. The discovery frame 501 may be generated by the web browser to display items that are related to the webpage 500. If it is determined that there are such related items, then the related items may be presented in the discovery frame 501. One manner for determining which items, if any, are related to the webpage 500 may be to analyze the URL of the webpage, and compare it to a list of available items. Specifically, an item detection system (discussed in further detail below) may detect items that are related to the webpage 500 based on the URL of the webpage 500 and its contents, and the detected items may be presented in the discovery frame 501 on the webpage 500. For example, if the webpage 500 is a webpage for a television program, such as “The Office,” then an item detection system may detect items related to “The Office,” such as the “Silly Jokes App” (mentioned above with respect to
Additionally, or alternatively, items may be related to the webpage 500 if they are associated with the owner or host of the webpage 500. Therefore, the item detection system may also analyze information on the owner or host of a webpage 500 to determine which items to present in the discovery frame 501. Yet another manner to detect related items of the webpage 500 may include analyzing the primary content presented in the main portion 502 of the webpage 500 or in another webpage from the same website as the webpage 500. For example, referring to
In some examples, the selection and sequencing of items presented in the discovery frame 501 may be dynamically modified as the user moves a cursor 503 on the webpage. For example, if the cursor 503 is over or near the story about the actress going to dinner at the new restaurant, then the discovery frame 501 might only present items related to the new restaurant or restaurants in general. Alternatively, the location of the cursor 503 may influence the order in which items appear in the discovery frame 501. For example, referring to
In order to associate a portion of a webpage 500 with related items, the webpage 500 may contain elements (e.g., HTML elements) that have specific context information or explicit item associations. As a result, when a cursor is moved within some proximity to the element or the element is clicked, the element may invoke processing to determine related items for the discovery frame 501. In some embodiments, the webpage 500 itself (without additional elements for context information or explicit item associations) can determine changes in context based on cursor location (e.g., mouseover events) or click events, and then call the second screen experience manager 340 to determine the appropriate list of items for the discovery frame 501. Also, the webpage 500 may collect (e.g., at load time or prior thereto) all items based on the context information of the webpage 500 so that relatively rapid adjustment of the order of the items in the discovery frame 501 may be achieved.
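The load-time collection and rapid local reordering described above can be sketched as a stable re-sort of preloaded items by the active cursor context. The item names and context tags below are hypothetical examples, not taken from the disclosure.

```python
# Illustrative sketch: items are fetched once at page load along with their
# context tags; as the cursor context changes (e.g., a mouseover event), the
# discovery frame is reordered locally without another call to the manager.

def order_items(items, active_context):
    """items: list of (name, set_of_context_tags). Items matching the active
    context sort first; Python's stable sort preserves relative order otherwise."""
    return sorted(items, key=lambda it: active_context not in it[1])


items = [
    ("Restaurant Finder", {"restaurant", "dining"}),
    ("Celebrity Quiz",    {"actress", "celebrity"}),
    ("Coupon Pack",       {"restaurant", "shopping"}),
]
reordered = order_items(items, "restaurant")  # restaurant-related items move up
```

Because the full item list is already local, only the ordering changes on each cursor event, which supports the "relatively rapid adjustment" goal noted above.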
Referring to
After the search button 502 is selected, a new webpage 530 may be displayed. The new webpage 530 may display detected items based on the search results. For example, referring to
The item database 601 may store item data, including one or more items (e.g., applications, audio/video content, links, etc.) and item information (e.g., context information, an item ID, author information, platform information, rating information, etc.) in association with each item. Item information may be used to determine which items are of interest. For example, item information may be searched to identify one or more items, and those items may be displayed in a discovery frame 405 of the program guide 400. Table 1 below shows a graphical representation of example item data stored in the item database 601.
Table 1 includes items corresponding to the items shown in
It should be understood that Table 1 illustrates example relationships of various information stored in memory, and that these relationships could also be illustrated with more than one table. For example, there could be three tables where a first table includes metadata in association with items, a second table includes identifiers in association with content, and a third table includes links between the first table and second table. In some examples, storage capacity may be conserved with a three table arrangement as one entry in the first table can link to multiple entries in the second table and one entry in the second table can link to multiple entries in the first table. This arrangement may be particularly desirable where there are similar items for multiple platforms. It should also be understood that Table 1 above illustrates only some of the types of item information that may be stored in the database 601. In other embodiments, more or less item information may be stored. For example, an additional column may include an Entertainment Identifier Registry (EIDR) identifier associated with each piece of content. Also, each item does not have to include each type of item information. Some items may have data for each type of item information while some items may have data for just some of the types of item information. This is illustrated in Table 1 by the empty cell in the “Keywords” column for the “Bonus Clip!” item.
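The three-table arrangement suggested above is a conventional many-to-many design, sketched here with sqlite3. The column names and sample rows are illustrative assumptions; the disclosure does not fix a schema.

```python
# Minimal sketch of the three-table arrangement: an items table, a content
# table, and a link table between them. One item row can link to multiple
# content rows (and vice versa), giving the storage saving described above.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE items   (item_id INTEGER PRIMARY KEY, name TEXT, platform TEXT);
CREATE TABLE content (content_id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE item_content (
    item_id    INTEGER REFERENCES items(item_id),
    content_id INTEGER REFERENCES content(content_id),
    PRIMARY KEY (item_id, content_id)
);
""")
conn.execute("INSERT INTO items VALUES (1, 'Silly Jokes App', 'tablet')")
conn.execute("INSERT INTO content VALUES (10, 'The Office')")
conn.execute("INSERT INTO content VALUES (11, 'Parks and Recreation')")
# a single item entry linked to two pieces of content
conn.execute("INSERT INTO item_content VALUES (1, 10)")
conn.execute("INSERT INTO item_content VALUES (1, 11)")
rows = conn.execute("""
    SELECT c.title FROM content c
    JOIN item_content ic ON ic.content_id = c.content_id
    WHERE ic.item_id = 1 ORDER BY c.title
""").fetchall()
```

The link table stores only identifier pairs, so an item that applies to many programs (or a program with many items, e.g., per-platform variants) is stored once rather than duplicated.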
The database 601, regardless of its contents, may be configured to connect to the query engine 602 of the item detection system 600. The query engine 602 may be a module of code, including computer-executable instructions, executed by a processor of a computing device 200. The query engine 602 may be configured to receive context information from a user device 610, which may be a first screen device 301, second screen device 302, interface 120, or any other computing device 200. The context information may include any type of data that can be used to search the database 601. In other words, context information may include any type of data that associates an item to content. For example, the context information may be an identifier that identifies a selected content or portion thereof. Specifically, the context information may include the name of a television program selected using a program guide 400. Another example of context information may be the name (e.g., NBC) or number (e.g., channel 4) associated with a logical channel number or service identifier that a user is scrolling over within the program guide 400. Still another example of context information may include keywords (e.g., "video games," "football," "trivia," etc.), a URL (e.g., "http://www.youtube.com"), or the name of a book or other publication (including those available in digital form) entered into user input fields 501 of a webpage 500. In some examples, context information may also include user preferences associated with a user of the user device 610 or the user device 610 itself. Additionally, or alternatively, in some cases, the context information may include information about the user device 610, such as what operating system the user device 610 is running, what type of display the user device 610 is coupled to, what type of video card the user device 610 includes, or any other information pertaining to the software and/or hardware of the user device 610.
The following provides a non-exhaustive list of examples of context information: a content title (e.g., television program title, electronic book title, etc.); content description; content format (e.g., image, video, etc.); content file-type (e.g., .avi, .divx, .mov, etc.); genre (sports, comedy, romance, reality television, etc.); content provider; keywords or search string; location in program guide 400 or other screens; location in webpage 500; information regarding other items; program ID; episode ID; series ID; actor/actress names; producer/director names; paid associations (sponsorships); item ID (where item owner/creator explicitly designates an item as corresponding to content); program closed captioning feed; video data of content; audio data of content; similar asset consumption data (those who liked this content consumed this application); time of day; demographics of user; etc.
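Context information such as the above may be matched against stored item metadata by the query engine. The sketch below is illustrative only: the keyword-intersection matching rule and the per-device result limits are assumptions, since the disclosure leaves both open.

```python
# Hedged sketch of a query-engine search: match the supplied context keywords
# against each item's stored keywords, then cap the number of results based on
# the requesting device type. Matching rule and limits are assumed.

DEVICE_RESULT_LIMITS = {"tablet": 10, "smartphone": 5, "television": 3}


def search_items(item_db, context_keywords, device_type):
    """item_db: mapping of item name -> set of keywords (simplified metadata)."""
    matches = [
        item for item, keywords in item_db.items()
        if context_keywords & keywords      # any shared keyword counts as a match
    ]
    return matches[: DEVICE_RESULT_LIMITS.get(device_type, 5)]


item_db = {
    "Silly Jokes App": {"the office", "comedy"},
    "Bonus Clip!":     {"the office"},
    "Football Trivia": {"sports", "trivia"},
}
results = search_items(item_db, {"the office"}, "television")
```

Even a single piece of context, such as a program name, can match multiple items, and the device-dependent cap reflects that different screens can present different numbers of results.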
Using the context information, the query engine 602 may search the database 601. The query engine 602 may prompt a user for additional information if no search results are found. If search results are found, the query engine 602 may return the results to the user device 610. Also, if there are search results, the search results may include one or more items that were identified as matching the context information. For example, a plurality of items may be returned in response to a search using the contextual information even where the contextual information only includes a single piece of information, such as a television program name. In some embodiments, the number of items of the search results may vary depending on the user device 610 to which the search results are transmitted. For example, referring to
As shown in
Further, the mapping engine 604 may be configured to provide an interface for entities that wish to register items with the database 601. The mapping engine 604 may be a module of code, including computer-executable instructions, executed by a processor of a computing device 200. As shown in
The mapping engine 604 may determine whether an item is accepted for registration with the database 601. In some examples, an item may be accepted if it meets certain criteria, such as being an acceptable type of file, not exceeding a certain file size, and/or the creator of the item having agreed to the applicable terms and conditions. When the mapping engine 604 determines that an item is accepted, the mapping engine 604 may cause the item and its associated item information to be stored in the database 601. The mapping engine 604 may also allow the item and associated item information to be subsequently edited. Additionally, the mapping engine 604 may assign priorities and/or ratings to the items entered into the database 601. The priorities and ratings of the items may be based on a subscription tier of the entity providing the item. For example, if the provider of an item is a premium-level customer, the item may be given a higher priority and/or rating so that the item is more likely to be included in search results obtained by the query engine 602. The level/tier of an entity may be based on popularity, trustworthiness, etc. of the entity and/or fees collected from the entity or from end users of the items. Therefore, the mapping engine 604 may be configured to determine the level/tier of various entities based on these characteristics.
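The acceptance criteria and tier-based prioritization can be sketched as follows. The specific file types, size cap, and tier values are illustrative assumptions; the disclosure names the categories of criteria but not exact values:

```python
# Illustrative acceptance check for item registration with the database 601.
# Criteria values below are assumptions chosen for the sketch.

ACCEPTED_TYPES = {".html", ".app", ".mp4"}
MAX_FILE_SIZE = 50 * 1024 * 1024  # hypothetical 50 MB cap
TIER_PRIORITY = {"premium": 3, "standard": 2, "basic": 1}

def register_item(extension, size_bytes, terms_accepted, provider_tier):
    """Accept an item if it meets all criteria; return its assigned priority."""
    if extension not in ACCEPTED_TYPES:
        return None
    if size_bytes > MAX_FILE_SIZE:
        return None
    if not terms_accepted:
        return None
    # Higher-tier providers receive higher priority, making their items
    # more likely to surface in the query engine's search results.
    return TIER_PRIORITY.get(provider_tier, 1)
```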
To obtain information for making determinations based on such characteristics, the mapping engine 604 may use a settlement engine 605. The settlement engine 605 may be a module of code, including computer-executable instructions, executed by a processor of a computing device 200. The settlement engine 605 may track which items are showing up in search results of the query engine 602 and/or which items are being transferred from the query engine 602 to end users. The settlement engine 605 may also be configured to track placement of items and fees collected from the entity and/or end users of the items. For example, the settlement engine 605 may determine that an entity, which created/supplied a particular item, owes a certain amount of money based on placement of the particular item in a program guide 400. The settlement engine 605 may determine the identity of the entity with the assistance of the mapping engine 604, which may determine which entities are associated with which items.
In step 701, context information may be obtained to determine or describe criteria that will be used in a search to find related items that may be of interest to the user. The manner in which the context information is obtained may vary according to the particular embodiment. For example, referring to
Once context information is obtained, it may be transmitted to and received by the item detection system 600 at step 702. For example, the context information may be received by the query engine 602 of the item detection system. The item detection system 600 may receive the context information via various connections, including the links 101 and network 330. For example, where the item detection system 600 is located at the local office 103, the context information may be transmitted in the upstream bandwidth of the links 101 from the interface 120 through the TS 104 to the item detection system 600. In another example, where the item detection system 600 is located within or associated with the second screen experience manager 340, the context information may be transmitted wirelessly from a second screen device 302 to the second screen experience manager 340, for example, through the network 330.
In step 703, the received context information is used to detect items related to the content from which the context information is obtained. This detection may comprise searching a database 601 using the context information. When detecting related items, the item detection system 600 may be configured to search the database 601 until a certain number of related items are detected or to search the entire database 601 and choose a certain number of items from among all the items identified as having some relevance, e.g., a predetermined level of relevance. In some examples, an item might only be detected as a related item if it meets two or more criteria. For example, an item might only be detected or identified if it is operable on a particular platform and has a keyword matching one of the corresponding criteria in the context information. Various embodiments may use different algorithms for performing the search and detecting items at step 703.
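The two-criteria detection described in step 703 (platform operability plus a keyword match) might look like the following sketch; the field names and early-termination behavior are illustrative, one of the "different algorithms" the text contemplates:

```python
# Sketch of multi-criteria detection: an item is detected only if it is
# operable on the requesting platform AND shares a keyword with the
# context information. Searching stops once `limit` items are found.

def detect_items(database, context_keywords, platform, limit=5):
    keywords = {k.lower() for k in context_keywords}
    detected = []
    for item in database:
        platform_ok = platform in item["platforms"]
        keyword_ok = bool(keywords & {k.lower() for k in item["keywords"]})
        if platform_ok and keyword_ok:  # both criteria must be met
            detected.append(item)
        if len(detected) == limit:      # stop once enough items are found
            break
    return detected
```

An alternative, also described in the text, would scan the entire database and then choose the top items by relevance rather than stopping early.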
In step 704, the item detection system 600 may rank or arrange items in order based on various factors. One such factor may be a rating, which may be determined based on user feedback. For example, users may rate a particular item as deserving 4 out of 5 stars, and thus, the particular item may have a 4.0 rating. Another factor may be a priority associated with the item, which may be determined based on a status (e.g., premium status) of an entity that created or provided the item. For example, some entities may subscribe for premium status so that the items they submit have a higher priority, and therefore, may have a higher likelihood of being distributed. Yet another factor may be a degree of relevance, which may be a score representing how similar an item is to the received context information. Demographic and geographical information associated with the context information as well as time of day information of when the context information is provided to the item detection system 600 may also be factored into the arrangement. The item detection system 600 may determine the demographic and geographical information based on the user device 610 providing the context information and/or based on consumption data of a user using the user device 610. Still another factor may be user preferences, which may be provided by a user. Weights may be assigned to one or more of the factors so that a final order may be determined.
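The weighted arrangement of step 704 can be sketched as a weighted sum over normalized factor scores. The particular weights and the 0-to-1 scale are assumptions for illustration only:

```python
# Minimal weighted-ranking sketch combining factors described in step 704:
# user rating, provider priority, relevance to the context information,
# and user preference. Each factor is assumed normalized to 0..1.

WEIGHTS = {"rating": 0.3, "priority": 0.2, "relevance": 0.4, "preference": 0.1}

def final_score(item):
    """Weighted sum of an item's factor scores."""
    return sum(WEIGHTS[f] * item[f] for f in WEIGHTS)

def arrange(items):
    """Order detected items from highest to lowest final score."""
    return sorted(items, key=final_score, reverse=True)
```

Demographic, geographic, and time-of-day signals could be folded in as additional weighted factors in the same way.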
In step 705, the detected items, which may be arranged in step 704, may be delivered to the user device 610 that supplied the context information used to detect the items. For example, the detected items may be returned to an interface 120 that supplied the context information. Alternatively, the detected items may be delivered to another computing device 200 associated with the user device 610 that supplied the context information. For example, a second screen device 302 may extract context information from an interface such as a program guide 400 and send the context information to the item detection system 600, which may deliver detected items to a first screen device 301 associated with the second screen device 302.
The user device 610 that receives the detected items may then present the items at step 706. The items may be presented within a designated portion of a screen of the user device 610. For example, referring to
The item detection system 600 discussed above may assist users in discovering a number of applications, such as a supplemental content presentation application (also referred to herein as the “Watch With Me” application), that are related to a primary piece of content that the user is consuming. Below, further details describing the “Watch With Me” application are disclosed. The “Watch With Me” application may include computer-executable instructions that, when executed by at least one processor 201 of a computing device 200 (e.g., a second screen device 302), may cause the computing device 200 to render a second screen for presenting supplemental content related to primary content that is shown on a first screen device 301. The “Watch With Me” application may be launched/downloaded by any of the methods disclosed herein, such as by selecting the application from a program guide 400. The “Watch With Me” application may also be launched from within another application, and thus, the “Watch With Me” application could refer to one or more modules of code or scripts (e.g., JavaScript) within a larger application. For example, the “Watch With Me” application could be launched when a user uses another application on the second screen device 302 to tune the first screen device 301 to a channel. At that time, the other application may determine whether supplemental content for the primary content on the tuned-to channel is available through the “Watch With Me” application. If so, the other application may launch the “Watch With Me” application so that the supplemental content for the tuned-to channel is presented on the second screen device 302.
The second screen 800 may also include a timeline 801. The timeline 801 shown on the second screen 800 may correspond to the piece of primary content (e.g., a television show, sporting event, etc.) being consumed on the first screen device 301. In the example of
The timeline 801 may provide a graphical, linear representation of events and the chronological order in which they occur within the corresponding piece of content. Herein, the timeline 801 may refer to the graphical representation and/or the data (or computer-executable instructions) used to render the timeline 801. Further, the timeline 801 may demonstrate the relationship between a point in time of the primary content and the supplemental content presented on the second screen 800. Referring to
In some examples, the supplemental content presented on the second screen 800 may be approximately synchronized with the primary content being presented on a first screen device 301 so that the supplemental content may change along with the primary content. In such cases, as the primary content progresses (e.g., as “The Voice” continues), the shaded portion 802 may extend in length to illustrate the time point of the show. For example, referring to
Notably, a user may interact with the timeline 801 to select different points in time along the timeline 801 thereby causing different portions of the supplemental content to be presented on the second screen 800. In other words, the timeline 801 does not have to stay synchronized with the primary content. Referring to
Additionally, the timeline 801 may include markers 803 configured to mark certain points along the timeline 801. When a marker 803 is active (e.g., when the playback point in the primary content has reached the marker point in the timeline), it may cause a particular piece of supplemental content to be presented on the second screen. Referring to
In
The timeline 801 of
The first markers 803a may be pre-set and/or added in real-time. Where the first markers 803a are pre-set, the timeline 801 may include the first markers 803a when it is initially rendered by the second screen device 302. A provider of the primary content may know that a particular event will take place in the primary content at a particular point in time, and therefore, may create a first marker 803a corresponding to that particular point in time that will trigger a specific portion of supplemental content to be shown in the second screen 800. For example, a provider of primary content may know that an actor will be driving a particular car at a certain time during a television program, and therefore, may create a first marker 803a on the timeline 801 that causes a webpage, video, etc. that is related to the car to be presented on the second screen 800.
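The pre-set marker behavior can be sketched as follows. The marker times, content labels, and one-shot firing semantics are illustrative assumptions:

```python
# Sketch of pre-set first-marker triggering: when playback of the primary
# content reaches a marker's time point, the associated supplemental
# content is activated on the second screen. Times are in seconds.

markers = [
    {"time": 95, "content": "cast-interview video"},
    {"time": 190, "content": "car webpage"},   # e.g., actor drives the car
]

def active_content(playback_seconds, fired):
    """Return content for markers whose time has been reached but not yet fired."""
    triggered = []
    for m in markers:
        if playback_seconds >= m["time"] and m["time"] not in fired:
            fired.add(m["time"])
            triggered.append(m["content"])
    return triggered
```

The `fired` set ensures each marker triggers its supplemental content once, even as the playback position is polled repeatedly.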
As mentioned above, the first markers 803a may also be added in real-time. An administrative entity may be designated to monitor live content to identify interesting events. The administrative entity may determine that a particular event has occurred within a piece of primary content, and may desire to supply additional supplemental content for that event. If so, the administrative entity may send a signal (or cause a signal to be sent) to a second screen device 302 presenting supplemental content for that primary content so that a first marker 803a may be added to the corresponding timeline 801. The administrative entity may also maintain a copy of the first markers 803a in its own memory, as well as information for the supplemental content (e.g., the application and/or image files for an interactive application that is to appear at the three minutes and ten seconds (3:10) mark in a television program) and information identifying how/where the supporting files for the supplemental content may be retrieved, for presentation to users who request the same primary content in the future. As a result, when the added first marker 803a is selected, the second screen device 302 may be controlled to present a particular piece of supplemental content.
The second markers 803b may also be added by a user of the second screen device 302. When a user determines that an event in the primary content and/or the corresponding supplemental content is interesting, he/she may wish to mark that event. To accomplish this, a user may click/press on a part of the timeline 801 or a designated mark key 804, and as a result a second marker 803b may appear on the timeline 801 at a position corresponding to the current point in time in the primary content on the first screen device 301. In this manner, a user may be able to identify supplemental content for later consumption. Users may find it desirable to mark supplemental content, for example, when the user is busy consuming the primary content and does not want to be distracted for the time being, but wants to take a look at the supplemental content in more detail later. Also, the second screen device 302 may be used to replay part, or all, of the primary content corresponding to any of the pieces of supplemental content, and the user may mark the timeline to indicate points in time of the primary content that he/she would like to view again at a later time.
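User marking might be implemented along these lines; the marker representation and sort-on-insert behavior are illustrative choices, not specified by the disclosure:

```python
# Sketch of user-added second markers 803b: pressing the mark key 804 adds
# a marker at the current point in time of the primary content, so the
# user can return to that supplemental content later.

def add_user_marker(timeline, current_seconds, note=""):
    """Append a second marker at the current playback position."""
    marker = {"kind": "second", "time": current_seconds, "note": note}
    timeline.append(marker)
    timeline.sort(key=lambda m: m["time"])  # keep markers in chronological order
    return marker
```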
In addition, second markers 803b may also be generated automatically in response to an action of the user. For example, a user may react to certain primary content or supplemental content by selecting an option to share a comment with one or more friends in a social network, and as a result, a second marker 803b may be added to the timeline 801 at a time corresponding to when the user shares the comment. For example, a user may select a share option 805 on the second screen 800 in response to reading the Cee-Lo quote shown in
Referring to
If the user enters and shares data, a second marker 803b may be added to the timeline 801 at a point in time when the user selected the share option 805. Notably, a user may enter data at multiple times thereby creating a plurality of second markers 803b along the timeline 801. As a result, the user may be able to subsequently review portions of the primary content and/or supplemental content that he/she entered data for (e.g., portions the user commented on). Thus, the timeline 801 may include a log of data entries.
Further, when a user is finished consuming primary and/or supplemental content (whether because the user has viewed all of the content or the user chooses to stop consuming the content), the first markers 803a and/or second markers 803b of the timeline 801 may be stored. The timeline 801 may be stored in a storage area on the second screen device 302 or in a storage area on the network 330 (e.g., in the second screen experience manager 340) to which the second screen device 302 is connected. If a user desires to view the timeline 801 at a later time (e.g., when watching a rerun of the primary content, or when resuming playback of the primary content via a DVR), the user may view the previous data he/she entered by selecting the second markers 803b. Users may also delete second markers 803b that they have created. A user may wish to delete a second marker 803b after he/she has reviewed the supplemental content thereof and does not want to review it again.
As shown in
Further, the messages may be ordered based on a time that they were entered through their respective social network services or based on a time that they were received by the “Watch With Me” application through the data feeds described above. As the messages are placed in order, they may also be assigned to a point on the timeline 801. In other words, the messages may be synchronized with the primary content and/or timeline 801. Thus, by adjusting the shaded portion 802 of the timeline 801, a user may cause the application to pan to a portion of the second screen 1000 showing the corresponding messages. For example, referring to
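The message-to-timeline synchronization can be sketched as follows. The normalized 0-to-1 timeline position and the scrub window are illustrative assumptions:

```python
# Sketch of synchronizing social messages to the timeline 801: each
# message is ordered by timestamp and mapped to a position along the
# timeline, so scrubbing the shaded portion pans to nearby messages.

def assign_to_timeline(messages, start, duration):
    """Sort messages by time and tag each with a 0..1 timeline position."""
    ordered = sorted(messages, key=lambda m: m["time"])
    for m in ordered:
        offset = min(max(m["time"] - start, 0), duration)
        m["position"] = offset / duration
    return ordered

def messages_near(messages, position, window=0.05):
    """Messages whose timeline position falls within the scrubbed window."""
    return [m for m in messages if abs(m["position"] - position) <= window]
```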
Another aspect of the disclosure includes adding messages to the timeline 801. A user may want to mark one or more messages so that the user can refer back to the messages at a later time. To make this possible, the “Watch With Me” application may provide a post message option 1001 next to each message thereby allowing the user to select a message of interest to the user. When the user selects the message, a marker 803 may be added to the timeline 801. For example, referring to
Notably, the user may create a second marker 803b for any message on the second screen 1000. For example, referring to
Referring to both
In
Referring to
In some cases, the user may wish to play the desired portion on the first screen device 301 (e.g., a television). If so, the user may select a send-to-tv option 1101 on the second screen 1100. This selection of the send-to-tv option 1101 may cause the second screen device 302 to send a signal, including an indication of the first screen device 301, the user or a user account, the desired content, a time point for playback, and/or a duration for playback, to the DVR or recording server (whichever is set up to record the content and support this functionality). In turn, the signal may cause the DVR or recording server to check whether it has recorded the desired portion, and if so, to jump back to a point in time to play the desired portion. Alternatively, the DVR or recording server that receives this signal may set up a new stream containing the desired portion of primary content, and send a command to a computing device 200, such as the DVR, interface 120, or first screen device 301, to force the computing device to tune to the new service/channel carrying the stream. As a result, the first screen device 301 may re-present the desired portion of the primary content. After the desired portion is re-presented, the computing device 200 may tune back to the service/channel carrying the primary content that was being presented before the signal was received from the second screen device. While the first screen device 301 is re-presenting a desired portion of the primary content, the DVR or recording server may record (or buffer) the primary content being missed so that when the user is tuned back to the service/channel carrying the primary content, the user may consume the content from the point in time where he/she left off before tuning away to consume the re-presented portion.
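The send-to-tv signal described above can be sketched as a small request/response exchange. The field names and JSON serialization are hypothetical; the disclosure says only that the signal may identify the first screen device, the user or account, the desired content, a playback time point, and a playback duration:

```python
# Sketch of the send-to-tv exchange between the second screen device 302
# and a DVR or recording server. Field names are illustrative.

import json

def build_send_to_tv_signal(device_id, account, content_id, start, duration):
    """Serialize the request sent when the send-to-tv option 1101 is selected."""
    return json.dumps({
        "target_device": device_id,
        "account": account,
        "content": content_id,
        "start_seconds": start,
        "duration_seconds": duration,
    })

def handle_signal(raw, recordings):
    """DVR side: play the requested portion only if the content was recorded."""
    req = json.loads(raw)
    if req["content"] not in recordings:
        return None  # desired portion was not recorded; nothing to re-present
    return (req["content"], req["start_seconds"], req["duration_seconds"])
```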
In some embodiments, the user may wish to play other types of supplemental content (other than re-presentations of the primary content) on the first screen device 301 (e.g., a television). If so, the user may select the send-to-tv option 1101 on the second screen 1100. The “Watch With Me” application may present the send-to-tv option 1101 for certain pieces of supplemental content. When the send-to-tv option 1101 is selected, the “Watch With Me” application may cause the second screen device 302 to communicate with a first screen device 301 or computing device 200 connected thereto (e.g., interface 120). The second screen device 302 may then transmit the supplemental content via a wired or wireless connection (e.g., WiFi) to the first screen device 301 or an interface connected to the first screen device 301 thereby causing the first screen device 301 to play the supplemental content. Once the supplemental content finishes playing, the first screen device 301 may return to the primary content. In some embodiments, the “Watch With Me” application may cause the second screen 1100 to present the primary content (which may include commercials) or a related version of the primary content (e.g., a logo of an advertiser whose commercial would have been shown on the first screen device 301 as part of the primary content if the first screen device 301 were not presenting the supplemental content), while the first screen device 301 is presenting the supplemental content.
The one or more social network services 1201 may include FACEBOOK™, TWITTER™, TUNERFISH™, etc. Each social network service 1201 may include any computing device 200, such as personal computers, laptops, tablets, smartphones, PDAs, servers, etc. Any computing device 200 on which a user is signed-in or logged-in may be considered to belong to the social network service 1201. For example, a user may sign-in to the social network service 1201a using a tablet, and that tablet may be considered to belong to the social network service 1201a as long as the user is signed-in on that tablet. Indeed, the second screen device 302 in
The one or more news sources 1202 may include any computing device 200 configured to supply a data feed, such as a rich site summary (RSS) feed. Accordingly, a news source 1202 may include a server that hosts a webpage or website from which a data feed may be received. For example, a news source 1202 may include a website which presents blog posts and broadcasts blog posts through an RSS feed.
The second screen experience manager 340 may be configured to communicate with one or more of the social network services 1201 and news sources 1202 via the network 330 (e.g., the Internet). Specifically, the second screen experience manager 340 may be configured to receive one or more data feeds from each of the one or more social network services 1201 and news sources 1202. Accordingly, the second screen experience manager 340 may include an aggregator, such as an RSS reader, to receive and read the data feeds. One or more uniform resource identifiers (URIs) may be provided to the aggregator to configure the aggregator to subscribe to certain data feeds. These URIs may be provided by an administrative entity or other operator of the second screen experience manager 340 such that the data feeds may be subscribed to whether or not users have requested supplemental data from the data feeds. Alternatively, URIs may be provided in response to a request for supplemental content. For example, referring to
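The aggregator's two basic operations, subscribing to feed URIs and reading items out of a feed, can be sketched as follows. A real deployment would fetch each URI over HTTP; here the RSS XML is supplied directly so the parsing step is self-contained, and all names are illustrative:

```python
# Minimal aggregator sketch for the second screen experience manager 340:
# register data feed URIs and extract entries from an RSS 2.0 document.

import xml.etree.ElementTree as ET

subscribed_uris = set()

def subscribe(uri):
    """Register a data feed URI with the aggregator."""
    subscribed_uris.add(uri)

def read_feed(rss_xml):
    """Extract (title, description) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("description"))
            for item in root.iter("item")]
```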
Further, the second screen experience manager 340 may be configured to analyze the data feeds and organize the data. Table 2 below illustrates an example of the various associations that may be created as a result of organizing the data.
The data in Table 2 is provided to illustrate various aspects related to how the second screen experience manager 340 might organize various types of data. Here, organizing may include storing the various types of data in association with each other. The second screen experience manager 340 may include a database, or be coupled to a database, configured to store the various types of data in association with each other. Referring to Table 2, the “Social Network Service” column may include information identifying one of the social network services 1201. This information may be obtained based on information in the data feed indicating the source of the data feed. For example, where the data feed includes IPv4 or IPv6 packets, this information may be determined based on the source address. The “Timestamp” column may include a timestamp indicating a date and time for associated information received by the second screen experience manager 340 through the various data feeds. The timestamp may represent a time the associated information is received, or a time supplied by the social network service 1201 indicating a time that the associated information was generated or transmitted. Notably, the second screen experience manager 340 may be configured to receive information from different data feeds in parallel, and thus, different data may have the same or approximately the same timestamp. The “Message” column includes information entered by a user (e.g., a “tweet” on TWITTER™) or generated in response to a user action (e.g., selecting a “like” key on FACEBOOK™) through one of the social network services 1201, and forwarded to the second screen experience manager 340 through one of the data feeds. Although shown as text in Table 2, the message may include images, audio, video, etc. The “Popularity Score of Message” column may include a score (or other valuation) indicating how popular a particular message might be. 
The second screen experience manager 340 may analyze the messages to compute this score. Such analysis may include tallying a number of times that a message was retransmitted (e.g., retweeted) or a number of times a message was approved (e.g., liked). This popularity score may be used to determine whether messages are made accessible through the “Watch With Me” application at all or included in the timeline 801 as featured content, as disclosed herein. The “Username” column may include usernames identifying the users of the different social network services 1201 who originally created the message information received. Information regarding the username may be received in association with the respective message information through the data feeds. The “Subject Matter ID” column may include information identifying the subject matter that the message is directed to. Where the social network service is FACEBOOK™, the subject matter ID information may indicate a FACEBOOK™ page that was “liked.” In comparison, where the social network service is TWITTER™, the subject matter ID information may indicate a hashtag (e.g., “#TheVoice” hashtag) of the message. The subject matter ID information may also be received in association with the respective message and username through the data feeds. Lastly, the “Primary Content” column may include data identifying related primary content, which the second screen experience manager 340 may generate based on data in one of the other columns. For example, based on the subject matter ID indicating a FACEBOOK™ page of “The Voice,” the second screen experience manager 340 may determine that the message (e.g., a “like”) is related to the television program called “The Voice.”
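The popularity tally and the resulting gating decision can be sketched as follows. The threshold value and field names are assumptions for illustration:

```python
# Sketch of the popularity score: tally retransmissions (retweets) and
# approvals (likes), then surface only messages that clear a threshold.
# The threshold is an illustrative assumption.

FEATURED_THRESHOLD = 100

def popularity_score(message):
    """Retransmissions plus approvals, per the tallying described above."""
    return message.get("retransmissions", 0) + message.get("approvals", 0)

def organize(messages):
    """Keep only messages popular enough to surface, highest score first."""
    kept = [m for m in messages if popularity_score(m) >= FEATURED_THRESHOLD]
    kept.sort(key=popularity_score, reverse=True)
    return kept
```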
In addition to communicating with the social network services 1201 as described above, the second screen experience manager 340 may communicate with the second screen device 302 executing the “Watch With Me” application. When the “Watch With Me” application is launched, it may cause the second screen device 302 to connect to the second screen experience manager 340 via the network 330. The second screen device 302 may send signals to the second screen experience manager 340 indicating which primary content is being consumed and/or requesting specific supplemental content. Further, the second screen experience manager 340 may transmit data, which it receives from the social network services 1201, to the second screen device 302. For example, the second screen experience manager 340 may determine that the second screen device 302 should receive supplemental content related to “The Voice.” Then, the second screen experience manager 340 may search the information it received from the data feeds and stored, looking for data associated with primary content indicating “The Voice,” and may transmit the search results to the second screen device 302 so that the second screen device 302 may present a screen such as the second screen 1000 of
Notably, the second screen experience manager 340 may continue to send data (e.g., messages) to the second screen device 302 so that the second screen device may present the second screen 1000 with the most recent messages. Meanwhile, the second screen experience manager 340 may receive reports from the second screen device 302. Such reports may indicate selections of desired supplemental content, which the second screen experience manager 340 may provide in response. Additionally, or alternatively, such reports may indicate selections made by the user. For example, the “Watch With Me” application may cause the second screen device 302 to report that a user has selected a particular message (e.g., a user has made selection S1 in
In step 1300, the second screen experience manager 340 may register a second screen device 302 and/or a user associated with the second screen device 302. This registration process may include providing a supplemental content presentation application, such as the “Watch With Me” application, to a second screen device 302. Further, this registration step may include receiving information (e.g., a username, password, device ID, etc.) for setting up a user account. When setting up a user account, a user may specify user preferences that may determine which timelines and/or which supplemental content are subsequently provided to the user. For example, a user may specify his/her age, gender, interests, etc. so that he/she receives a timeline with certain pre-configured markers appropriate for that user. As a result, for example, one user may receive a first timeline with first supplemental content for a particular piece of primary content, while another user may receive a second timeline with second supplemental content for the same piece of primary content.
In addition, while setting up a user account, a user may also provide information related to social network services 1201 of the user for configuring the second screen experience manager 340 to access the social network services 1201. Since the second screen experience manager 340 may access the social network services 1201, a user may submit comments (or other data) to his/her social network services 1201 through the “Watch With Me” application running on a second screen device 302. For example, comments may be sent from a second screen device 302 to the second screen experience manager 340, which may then forward the comments to a server of the social network service 1201 along with information stored in the second screen experience manager 340 for authenticating the user to the particular social network service 1201. Further, setting up a user account may configure the second screen experience manager 340 to store usernames of friends, family, or other persons of interest in association with the user account, so that the second screen experience manager 340 may filter messages and send selected messages to a second screen device 302 associated with the user account.
Once a user account is set up, a second screen device 302 may be configured to interact with the second screen experience manager 340 so that the remaining steps of
In step 1301, the second screen experience manager 340 may receive one or more requests for supplemental content. The requests may be received from one or more first screen devices 301 or from one or more second screen devices 302. Specifically, a second screen device 302 may send a signal requesting a particular piece of supplemental content related to a particular piece of primary content, such as an episode of “The Voice.” In response to receiving the request, the second screen experience manager 340 may store an identifier identifying the second screen device 302 in association with information identifying the requested supplemental content.
In step 1302, the second screen experience manager 340 may receive or access one or more timelines 801. The timelines 801 may be provided by content creators (e.g., television show producers), the local office 103, or other approved entities. In some embodiments, the second screen experience manager 340 may generate one or more timelines 801. For example, the second screen experience manager 340 may use predefined code to set up a timeline 801 for a particular piece of content. More specifically, the second screen experience manager 340 may determine when a timeline 801 is not available for a particular piece of content, and in response, may create a new instance of a module (which may include one or more classes, such as Java classes) of computer-executable instructions designed to present a timeline 801. In some examples, the second screen experience manager 340 may use a content listing (e.g., program guide 400) to determine that a particular piece of content is new, and therefore, that a timeline is not yet available and should be generated.
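The on-demand instantiation described in step 1302 can be sketched as a get-or-create pattern; the class and registry names are illustrative stand-ins for the "predefined code" the text references:

```python
# Sketch of on-demand timeline creation: when no timeline 801 exists for
# a piece of content, instantiate a new one from a predefined module.

class Timeline:
    """Minimal stand-in for the timeline module instantiated per content."""
    def __init__(self, content_id, duration_seconds):
        self.content_id = content_id
        self.duration = duration_seconds
        self.markers = []   # first/second markers added later

timelines = {}

def get_or_create_timeline(content_id, duration_seconds):
    """Return the existing timeline, or create one when none is available."""
    if content_id not in timelines:
        timelines[content_id] = Timeline(content_id, duration_seconds)
    return timelines[content_id]
```

A content listing check (e.g., detecting that a program is new) would decide when this creation path is taken, as the text describes.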
In addition to creating a new instance of a module for a new timeline 801, the second screen experience manager 340 may also automatically add supplemental content to the timeline 801. In some examples, the second screen experience manager 340 may perform audio and/or image recognition processes on the content for which the timeline 801 is created and may add supplemental content according to the results of those processes. For example, audio recognition processes may be performed on the audio data of the content to determine that an actor in the content refers to a brand of clothing. As a result, the second screen experience manager 340 may add a link to a website for that brand of clothing into the timeline 801. Specifically, the second screen experience manager 340 may create a first marker 803a at a point along the timeline 801 representing a time when the actor referred to the brand of clothing so that the first marker 803a, when selected, may cause a link to the website for that brand of clothing to be presented on a second screen device 302. Similarly, image recognition processes may be used to identify images of an item within the video data of the content so that links to websites selling that item may be added to the timeline 801. Additionally, or alternatively, supplemental content may be automatically added to the timeline 801 based on the identity of the content for which the timeline 801 is being created. That is, based on the identity of the content, the second screen experience manager 340 may determine that predetermined supplemental content associated with that content should be added to the timeline 801. For example, where the content is an episode of "The Voice," one or more first markers 803a may be added to the timeline 801 providing additional information about one or more characters (e.g., Cee-Lo) known to be featured in the content.
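Turning a recognition result into a timeline entry can be sketched as inserting a time-stamped marker while keeping the timeline ordered. The marker dictionary shape is an assumption; the example timestamp and link are illustrative only.

```python
# Hypothetical sketch of automatic supplemental-content insertion: a
# recognition result (e.g., an audio match for a clothing brand mentioned
# at some point in the content) becomes a first marker 803a whose
# selection presents the associated supplemental content.

def add_marker(timeline_markers, timestamp, supplemental):
    """Insert a marker, keeping the timeline ordered by time."""
    timeline_markers.append({"time": timestamp, "supplemental": supplemental})
    timeline_markers.sort(key=lambda m: m["time"])
    return timeline_markers
```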
In another example, YouTube™ content on a particular channel known to be associated with the identified content may be incorporated into the timeline 801. In some cases, automatically added first markers 803a may be inserted into the timeline 801 at points corresponding to known commercial breaks within the content so as not to disrupt a user's consumption of the content.
In step 1303, the second screen experience manager 340 may receive one or more data transmissions from social network services 1201 and/or news sources 1202 as described above. As shown in
In step 1304, the data from the data transmissions may be analyzed and organized as described above. Like step 1303, step 1304 may be performed continuously or at intervals to organize the data from the data feeds as it is received.
In step 1305, an appropriate timeline 801 (either received, accessed, or generated) may be transmitted/delivered to the second screen device 302 that sent the request received in step 1301. The appropriate timeline 801 may be determined based on information within the request identifying the content for which supplemental content is desired or identifying the supplemental content itself. Here, delivering a timeline 801 may include sending data for the timeline 801 and indicating that the timeline data is for a timeline 801 so that a supplemental content presentation application (e.g., the "Watch With Me" application) running on a second screen device 302 may configure the timeline 801. In some embodiments, step 1305 may include multicasting one or more timelines 801 to all second screen devices 302 in communication with the second screen experience manager 340 using one or more multicast signals. In such embodiments, each of the second screen devices 302 may determine whether to buffer and/or present the timelines 801. Where timelines 801 are multicast, step 1305 might not be performed in response to the request received in step 1301, and instead, step 1305 may transmit timelines 801 continuously or intermittently.
Delivering an appropriate timeline 801 at step 1305 may include delivering a version of the timeline based on a type of receiving device and/or user preferences. There may be multiple timelines 801 for the same piece of content that are designed for different types of devices. For example, a smartphone may receive one version of the timeline 801 while another type of device (e.g., a tablet) may receive another version of the timeline 801. The version of the timeline 801 delivered to the smartphone might have a smaller layout than the version of the timeline 801 delivered to the other type of device. Alternatively, the version delivered to the smartphone may have a vertical layout as opposed to a horizontal layout that is used in the version sent to the other type of device.
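The device-dependent version selection described above reduces to a lookup keyed by device type. The mapping below simply encodes the smartphone/tablet example from the text; the dictionary shape and default are assumptions.

```python
# Hypothetical sketch of version selection in step 1305: the same piece of
# content may have several timeline layouts, and the delivered version
# depends on the type of the receiving device.

VERSIONS = {
    "smartphone": {"layout": "vertical", "size": "small"},
    "tablet": {"layout": "horizontal", "size": "large"},
}


def select_version(device_type, default="tablet"):
    """Pick the timeline version for a device type, falling back to a default."""
    return VERSIONS.get(device_type, VERSIONS[default])
```

User preferences could be layered on top of this lookup, e.g., by letting a stored preference override the device-type default.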
In step 1306, the second screen experience manager 340 may receive a request for messages related to a particular piece of content. For example, referring to
In response to receiving the request at step 1306, the second screen experience manager 340 may deliver the messages at step 1307. Specifically, the second screen experience manager 340 may search a database or other storage area for messages related to the content identified in the request. In cases where the data from the data feeds is analyzed and organized in step 1304, the messages may be more easily found by searching for the messages using information identifying the content. In some embodiments, step 1307 may include multicasting messages to all second screen devices 302 in communication with the second screen experience manager 340 using one or more multicast signals. In such embodiments, each of the second screen devices 302 may determine whether to buffer and/or present the messages. Where messages are multicast, step 1307 might not be performed in response to step 1306, and instead, step 1307 may transmit messages continuously or intermittently.
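The benefit of organizing feed data in step 1304 is that step 1307's search becomes a keyed lookup rather than a scan. A minimal sketch of such a store, with assumed names:

```python
# Hypothetical sketch of message storage: organized feed data is keyed by
# content identifier so that messages for the requested program are found
# with a single lookup in step 1307.

class MessageStore:
    def __init__(self):
        self._by_content = {}  # content_id -> list of messages

    def add(self, content_id, message):
        """File an incoming feed message under its content identifier."""
        self._by_content.setdefault(content_id, []).append(message)

    def messages_for(self, content_id):
        """Retrieve messages related to the identified content."""
        return list(self._by_content.get(content_id, []))
```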
Further, in some embodiments, step 1303 might not be performed until the request for messages is received at step 1306. That is, the second screen experience manager 340 might wait until it receives a request for messages related to a particular piece of content, and then may subscribe to a data feed that provides messages for that particular piece of content. In such embodiments, the delivering of messages in step 1307 may include subscribing to a particular data feed based on the request, receiving messages from that data feed, and delivering those messages.
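The deferred subscription just described is a lazy-initialization pattern: subscribe to a content's data feed only when the first request for that content's messages arrives. In this sketch, `subscribe` stands in for a real feed API and is purely illustrative.

```python
# Hypothetical sketch of deferring step 1303: the manager subscribes to a
# data feed for particular content only on the first request for that
# content's messages, then reuses the subscription afterward.

class LazyFeedManager:
    def __init__(self, subscribe):
        self._subscribe = subscribe  # callable: content_id -> feed handle
        self._feeds = {}             # content_id -> feed handle

    def feed_for(self, content_id):
        """Subscribe on first use; reuse the existing feed thereafter."""
        if content_id not in self._feeds:
            self._feeds[content_id] = self._subscribe(content_id)
        return self._feeds[content_id]
```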
In step 1308, the second screen experience manager 340 may receive reports from one or more second screen devices 302. As described above with respect to
In step 1310, the second screen experience manager 340 may acquire updates to the timeline 801. That is, the second screen experience manager 340 may acquire a command indicating that one or more first markers 803a should be added to the timeline 801 so that a corresponding piece of supplemental content is shown when each of the first markers 803a is selected. These update commands may include the supplemental content to be added, a timestamp indicating a location in the timeline 801 where a marker for the supplemental content should be added, and information identifying the corresponding content so that the correct timeline 801 is updated. Further, these update commands may be received from an administrative entity or acquired from the second screen experience manager 340 itself when the second screen experience manager 340 automatically analyzes the reports in step 1308. For example, an administrative entity monitoring a particular piece of content in real-time may detect that an event occurred within the content and may decide to provide supplemental content in response to that event. If so, the administrative entity may provide the supplemental content along with a command to the second screen experience manager 340 so that the second screen experience manager 340 may modify the timeline 801 accordingly. In another example, the administrative entity (or the second screen experience manager 340 itself) may determine that a particular message should be featured, and may provide supplemental content, including the message, along with a command to the second screen experience manager 340 so that the second screen experience manager 340 may modify the timeline 801 accordingly. While
In step 1311, the second screen experience manager 340 may generate instructions that cause a second screen device 302 to modify a timeline 801. For example, when a command is received from an administrative entity in step 1310, the second screen experience manager 340 may generate instructions that cause the second screen device 302 to add a first marker and corresponding supplemental content to the timeline 801. These generated instructions may include computer-executable instructions that the second screen device 302 may process or may provide information that directs the second screen device 302 to execute computer-executable instructions therein. In the latter case, the information may include the supplemental content to be added, a timestamp indicating a location in the timeline 801 where a marker 803 for the supplemental content should be added, and/or information identifying the corresponding content so that the correct timeline 801 is updated. In the former case, the computer-executable instructions may include this information as well as a script or other module of code that, when executed, may instruct the “Watch With Me” application on how to modify its timeline 801 to present the additional supplemental content.
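For the data-only form of the generated instructions (the case where the second screen device 302 executes its own code), the payload reduces to the three pieces of information listed above. The field names in this sketch are assumptions.

```python
# Hypothetical sketch of step 1311's data-only instructions: the payload
# carries the supplemental content to add, a timestamp locating the new
# marker, and the content identity so the correct timeline 801 is updated.

def build_update_instructions(content_id, timestamp, supplemental):
    return {
        "content_id": content_id,      # selects the correct timeline 801
        "timestamp": timestamp,        # where the marker 803 is inserted
        "supplemental": supplemental,  # what the marker presents when selected
    }
```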
In step 1312, the generated instructions may be transmitted to a second screen device 302. In some cases, the generated instructions may be pushed to the second screen device 302 once they are generated. Thus, second screen devices 302 might not need to request updates in order for the timeline 801 to stay up to date with the latest supplemental content. Alternatively, once the instructions are generated, they may be stored for transmission upon a subsequent request for updates. For example, the instructions may be stored in a memory of the second screen experience manager 340 until the second screen experience manager 340 receives a request from a second screen device 302 for updates to its timeline 801, at which point the second screen experience manager 340 may transmit the appropriate instructions.
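The two delivery paths in step 1312 (push on generation versus store until a device requests updates) can be sketched as follows; the dispatcher class and its `send` callback are assumptions for illustration.

```python
# Hypothetical sketch of step 1312: generated instructions may be pushed
# immediately, or held in memory until the device next asks for updates.

class InstructionDispatcher:
    def __init__(self):
        self._pending = {}  # device_id -> list of stored instruction payloads

    def dispatch(self, device_id, instructions, push, send=None):
        if push:
            # Push path: transmit as soon as the instructions are generated.
            send(device_id, instructions)
        else:
            # Store path: hold until the device requests updates.
            self._pending.setdefault(device_id, []).append(instructions)

    def on_update_request(self, device_id, send):
        """Transmit any stored instructions when the device asks for updates."""
        for instructions in self._pending.pop(device_id, []):
            send(device_id, instructions)
```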
Although not shown, it should be understood that one or more of the steps in
Although example embodiments are described above, the various features and steps may be combined, divided, omitted, and/or augmented in any desired manner, depending on the specific secure process desired. For example, the process of
Number | Name | Date | Kind |
---|---|---|---|
5287489 | Nimmo et al. | Feb 1994 | A |
5321750 | Nadan | Jun 1994 | A |
5353121 | Young et al. | Oct 1994 | A |
5485221 | Banker et al. | Jan 1996 | A |
5530939 | Mansfield, Jr. et al. | Jun 1996 | A |
5583563 | Wanderscheid et al. | Dec 1996 | A |
5589892 | Knee et al. | Dec 1996 | A |
5592551 | Lett et al. | Jan 1997 | A |
5594509 | Florin et al. | Jan 1997 | A |
5613057 | Caravel | Mar 1997 | A |
5621456 | Florin et al. | Apr 1997 | A |
5657072 | Aristides et al. | Aug 1997 | A |
5659793 | Escobar et al. | Aug 1997 | A |
5666645 | Thomas et al. | Sep 1997 | A |
5675752 | Scott et al. | Oct 1997 | A |
5694176 | Bruette et al. | Dec 1997 | A |
5802284 | Karlton et al. | Sep 1998 | A |
5826102 | Escobar et al. | Oct 1998 | A |
5844620 | Coleman et al. | Dec 1998 | A |
5850218 | LaJoie et al. | Dec 1998 | A |
5852435 | Vigneaux et al. | Dec 1998 | A |
5860073 | Ferrel et al. | Jan 1999 | A |
5883677 | Hofmann | Mar 1999 | A |
5892902 | Clark | Apr 1999 | A |
5892905 | Brandt et al. | Apr 1999 | A |
5905492 | Straub et al. | May 1999 | A |
5929849 | Kikinis | Jul 1999 | A |
5945987 | Dunn | Aug 1999 | A |
5960194 | Choy et al. | Sep 1999 | A |
5990890 | Etheredge | Nov 1999 | A |
5996025 | Day et al. | Nov 1999 | A |
6002394 | Schein et al. | Dec 1999 | A |
6005561 | Hawkins et al. | Dec 1999 | A |
6008803 | Rowe et al. | Dec 1999 | A |
6008836 | Bruck et al. | Dec 1999 | A |
6016144 | Blonstein et al. | Jan 2000 | A |
6025837 | Matthews, III et al. | Feb 2000 | A |
6049823 | Hwang | Apr 2000 | A |
6061695 | Slivka et al. | May 2000 | A |
6067108 | Yokote et al. | May 2000 | A |
6088722 | Herz et al. | Jul 2000 | A |
6091411 | Straub et al. | Jul 2000 | A |
6094237 | Hashimoto | Jul 2000 | A |
6141003 | Chor et al. | Oct 2000 | A |
6148081 | Szymanski et al. | Nov 2000 | A |
6162697 | Singh et al. | Dec 2000 | A |
6169543 | Wehmeyer | Jan 2001 | B1 |
6172677 | Stautner et al. | Jan 2001 | B1 |
6177931 | Alexander et al. | Jan 2001 | B1 |
6191781 | Chaney et al. | Feb 2001 | B1 |
6195692 | Hsu | Feb 2001 | B1 |
6205582 | Hoarty | Mar 2001 | B1 |
6219839 | Sampsell | Apr 2001 | B1 |
6239795 | Ulrich et al. | May 2001 | B1 |
6240555 | Shoff et al. | May 2001 | B1 |
6281940 | Sciammarella | Aug 2001 | B1 |
6292187 | Gibbs et al. | Sep 2001 | B1 |
6292827 | Raz | Sep 2001 | B1 |
6295057 | Rosin et al. | Sep 2001 | B1 |
6314569 | Chernock et al. | Nov 2001 | B1 |
6317885 | Fries | Nov 2001 | B1 |
6345305 | Beck et al. | Feb 2002 | B1 |
6405239 | Addington et al. | Jun 2002 | B1 |
6415438 | Blackketter et al. | Jul 2002 | B1 |
6421067 | Kamen et al. | Jul 2002 | B1 |
6426779 | Noguchi et al. | Jul 2002 | B1 |
6442755 | Lemmons et al. | Aug 2002 | B1 |
6477705 | Yuen et al. | Nov 2002 | B1 |
6486920 | Arai et al. | Nov 2002 | B2 |
6522342 | Gagnon et al. | Feb 2003 | B1 |
6529950 | Lumelsky et al. | Mar 2003 | B1 |
6530082 | Del Sesto et al. | Mar 2003 | B1 |
6532589 | Proehl et al. | Mar 2003 | B1 |
6564263 | Bergman et al. | May 2003 | B1 |
6567104 | Andrew et al. | May 2003 | B1 |
6571392 | Zigmond et al. | May 2003 | B1 |
6591292 | Morrison et al. | Jul 2003 | B1 |
6621509 | Eiref et al. | Sep 2003 | B1 |
6636887 | Augeri | Oct 2003 | B1 |
6658661 | Arsenault et al. | Dec 2003 | B1 |
6678891 | Wilcox et al. | Jan 2004 | B1 |
6684400 | Goode et al. | Jan 2004 | B1 |
6704359 | Bayrakeri et al. | Mar 2004 | B1 |
6731310 | Craycroft et al. | May 2004 | B2 |
6745367 | Bates et al. | Jun 2004 | B1 |
6760043 | Markel | Jul 2004 | B2 |
6763522 | Kondo et al. | Jul 2004 | B1 |
6766526 | Ellis | Jul 2004 | B1 |
6806887 | Chernock et al. | Oct 2004 | B2 |
6857128 | Borden, IV et al. | Feb 2005 | B1 |
6886029 | Pecus et al. | Apr 2005 | B1 |
6904610 | Bayrakeri et al. | Jun 2005 | B1 |
6910191 | Segerberg et al. | Jun 2005 | B2 |
6918131 | Rautila et al. | Jul 2005 | B1 |
6963880 | Pingte et al. | Nov 2005 | B1 |
7028327 | Dougherty et al. | Apr 2006 | B1 |
7065785 | Shaffer et al. | Jun 2006 | B1 |
7080400 | Navar | Jul 2006 | B1 |
7103904 | Blackketter et al. | Sep 2006 | B1 |
7114170 | Harris et al. | Sep 2006 | B2 |
7134072 | Lovett et al. | Nov 2006 | B1 |
7152236 | Wugofski et al. | Dec 2006 | B1 |
7162694 | Venolia | Jan 2007 | B2 |
7162697 | Markel | Jan 2007 | B2 |
7174512 | Martin et al. | Feb 2007 | B2 |
7197715 | Valeria | Mar 2007 | B1 |
7207057 | Rowe | Apr 2007 | B1 |
7213005 | Mourad et al. | May 2007 | B2 |
7221801 | Jang et al. | May 2007 | B2 |
7237252 | Billmaier | Jun 2007 | B2 |
7293275 | Krieger | Nov 2007 | B1 |
7305696 | Thomas et al. | Dec 2007 | B2 |
7313806 | Williams et al. | Dec 2007 | B1 |
7337457 | Pack et al. | Feb 2008 | B2 |
7360232 | Mitchell | Apr 2008 | B2 |
7363612 | Satuloori et al. | Apr 2008 | B2 |
7406705 | Crinon et al. | Jul 2008 | B2 |
7440967 | Chidlovskii | Oct 2008 | B2 |
7464344 | Carmichael et al. | Dec 2008 | B1 |
7516468 | Deller et al. | Apr 2009 | B1 |
7523180 | DeLuca et al. | Apr 2009 | B1 |
7587415 | Gaurav et al. | Sep 2009 | B2 |
7624416 | Vandermolen et al. | Nov 2009 | B1 |
7640487 | Amielh-Caprioglio et al. | Dec 2009 | B2 |
7702315 | Engstrom et al. | Apr 2010 | B2 |
7703116 | Moreau et al. | Apr 2010 | B1 |
7721307 | Hendricks et al. | May 2010 | B2 |
7743330 | Hendricks et al. | Jun 2010 | B1 |
7752258 | Lewin et al. | Jul 2010 | B2 |
7861259 | Barone, Jr. | Dec 2010 | B2 |
7913286 | Sarachik et al. | Mar 2011 | B2 |
7958528 | Moreau et al. | Jun 2011 | B2 |
7975277 | Jerding et al. | Jul 2011 | B1 |
8006262 | Rodriguez et al. | Aug 2011 | B2 |
8032914 | Rodriguez | Oct 2011 | B2 |
8156533 | Crichton | Apr 2012 | B2 |
8220018 | de Andrade et al. | Jul 2012 | B2 |
8266652 | Roberts et al. | Sep 2012 | B2 |
8296805 | Tabatabai et al. | Oct 2012 | B2 |
8365230 | Chane et al. | Jan 2013 | B2 |
8381259 | Khosla | Feb 2013 | B1 |
8434109 | Kamimaeda et al. | Apr 2013 | B2 |
8448208 | Moreau et al. | May 2013 | B2 |
8660545 | Redford et al. | Feb 2014 | B1 |
8699862 | Sharifi | Apr 2014 | B1 |
8793256 | McIntire | Jul 2014 | B2 |
8850495 | Pan | Sep 2014 | B2 |
8863196 | Patil et al. | Oct 2014 | B2 |
8938675 | Holladay et al. | Jan 2015 | B2 |
8943533 | de Andrade et al. | Jan 2015 | B2 |
8973063 | Spilo et al. | Mar 2015 | B2 |
9021528 | Moreau et al. | Apr 2015 | B2 |
9363560 | Moreau et al. | Jun 2016 | B2 |
9473548 | Chakrovorthy et al. | Oct 2016 | B1 |
9516253 | De Andrade et al. | Dec 2016 | B2 |
20010014206 | Artigalas et al. | Aug 2001 | A1 |
20010027563 | White et al. | Oct 2001 | A1 |
20010049823 | Matey | Dec 2001 | A1 |
20010056573 | Kovac et al. | Dec 2001 | A1 |
20010056577 | Gordon et al. | Dec 2001 | A1 |
20020010928 | Sahota | Jan 2002 | A1 |
20020016969 | Kimble | Feb 2002 | A1 |
20020023270 | Thomas et al. | Feb 2002 | A1 |
20020026642 | Augenbraun et al. | Feb 2002 | A1 |
20020032905 | Sherr et al. | Mar 2002 | A1 |
20020041104 | Graf et al. | Apr 2002 | A1 |
20020042915 | Kubischta et al. | Apr 2002 | A1 |
20020042920 | Thomas et al. | Apr 2002 | A1 |
20020046099 | Frengut et al. | Apr 2002 | A1 |
20020059094 | Hosea et al. | May 2002 | A1 |
20020059586 | Carney et al. | May 2002 | A1 |
20020059629 | Markel | May 2002 | A1 |
20020067376 | Martin et al. | Jun 2002 | A1 |
20020069407 | Fagnani et al. | Jun 2002 | A1 |
20020070978 | Wishoff et al. | Jun 2002 | A1 |
20020078444 | Krewin et al. | Jun 2002 | A1 |
20020078449 | Gordon et al. | Jun 2002 | A1 |
20020083450 | Kamen et al. | Jun 2002 | A1 |
20020100041 | Rosenberg et al. | Jul 2002 | A1 |
20020107973 | Lennon et al. | Aug 2002 | A1 |
20020108121 | Alao et al. | Aug 2002 | A1 |
20020108122 | Alao et al. | Aug 2002 | A1 |
20020120609 | Lang et al. | Aug 2002 | A1 |
20020124254 | Kikinis | Sep 2002 | A1 |
20020144269 | Connelly | Oct 2002 | A1 |
20020144273 | Reto | Oct 2002 | A1 |
20020147645 | Alao et al. | Oct 2002 | A1 |
20020152477 | Goodman et al. | Oct 2002 | A1 |
20020156839 | Peterson et al. | Oct 2002 | A1 |
20020156890 | Carlyle et al. | Oct 2002 | A1 |
20020162120 | Mitchell | Oct 2002 | A1 |
20020169885 | Alao et al. | Nov 2002 | A1 |
20020170059 | Hoang | Nov 2002 | A1 |
20020171691 | Currans et al. | Nov 2002 | A1 |
20020171940 | He et al. | Nov 2002 | A1 |
20020184629 | Sie et al. | Dec 2002 | A1 |
20020188944 | Noble | Dec 2002 | A1 |
20020196268 | Wolff et al. | Dec 2002 | A1 |
20020199187 | Gissin et al. | Dec 2002 | A1 |
20020199190 | Su | Dec 2002 | A1 |
20030001880 | Holtz et al. | Jan 2003 | A1 |
20030005444 | Crinon et al. | Jan 2003 | A1 |
20030005453 | Rodriguez et al. | Jan 2003 | A1 |
20030014752 | Zaslaysky et al. | Jan 2003 | A1 |
20030014753 | Beach et al. | Jan 2003 | A1 |
20030018755 | Masterson et al. | Jan 2003 | A1 |
20030023970 | Panabaker | Jan 2003 | A1 |
20030025832 | Swart et al. | Feb 2003 | A1 |
20030028871 | Wang et al. | Feb 2003 | A1 |
20030028873 | Lemmons | Feb 2003 | A1 |
20030041104 | Wingard et al. | Feb 2003 | A1 |
20030051246 | Wilder et al. | Mar 2003 | A1 |
20030056216 | Wugofski et al. | Mar 2003 | A1 |
20030056218 | Wingard et al. | Mar 2003 | A1 |
20030058948 | Kelly et al. | Mar 2003 | A1 |
20030066081 | Barone et al. | Apr 2003 | A1 |
20030067554 | Klarfeld et al. | Apr 2003 | A1 |
20030068046 | Lindqvist et al. | Apr 2003 | A1 |
20030070170 | Lennon | Apr 2003 | A1 |
20030079226 | Barrett | Apr 2003 | A1 |
20030084443 | Laughlin et al. | May 2003 | A1 |
20030084444 | Ullman et al. | May 2003 | A1 |
20030084449 | Chane et al. | May 2003 | A1 |
20030086694 | Davidsson | May 2003 | A1 |
20030093790 | Logan et al. | May 2003 | A1 |
20030093792 | Labeeb et al. | May 2003 | A1 |
20030097657 | Zhou et al. | May 2003 | A1 |
20030110500 | Rodriguez | Jun 2003 | A1 |
20030110503 | Perkes | Jun 2003 | A1 |
20030115219 | Chadwick | Jun 2003 | A1 |
20030115612 | Mao et al. | Jun 2003 | A1 |
20030126601 | Roberts et al. | Jul 2003 | A1 |
20030132971 | Billmaier et al. | Jul 2003 | A1 |
20030135464 | Mourad et al. | Jul 2003 | A1 |
20030140097 | Schloer | Jul 2003 | A1 |
20030151621 | McEvilly et al. | Aug 2003 | A1 |
20030158777 | Schiff et al. | Aug 2003 | A1 |
20030172370 | Satuloori et al. | Sep 2003 | A1 |
20030177501 | Takahashi et al. | Sep 2003 | A1 |
20030182663 | Gudorf et al. | Sep 2003 | A1 |
20030189668 | Newnam et al. | Oct 2003 | A1 |
20030204814 | Elo et al. | Oct 2003 | A1 |
20030204846 | Breen et al. | Oct 2003 | A1 |
20030204854 | Blackketter et al. | Oct 2003 | A1 |
20030207696 | Willenegger et al. | Nov 2003 | A1 |
20030226141 | Krasnow et al. | Dec 2003 | A1 |
20030229899 | Thompson et al. | Dec 2003 | A1 |
20040003402 | McKenna | Jan 2004 | A1 |
20040003404 | Boston et al. | Jan 2004 | A1 |
20040019900 | Knightbridge et al. | Jan 2004 | A1 |
20040019908 | Williams et al. | Jan 2004 | A1 |
20040022271 | Fichet et al. | Feb 2004 | A1 |
20040024753 | Chane et al. | Feb 2004 | A1 |
20040025180 | Begeja et al. | Feb 2004 | A1 |
20040031015 | Ben-Romdhane et al. | Feb 2004 | A1 |
20040031058 | Reisman | Feb 2004 | A1 |
20040031062 | Lemmons | Feb 2004 | A1 |
20040039754 | Harple | Feb 2004 | A1 |
20040073915 | Dureau | Apr 2004 | A1 |
20040078814 | Allen | Apr 2004 | A1 |
20040107437 | Reichardt et al. | Jun 2004 | A1 |
20040107439 | Hassell et al. | Jun 2004 | A1 |
20040128699 | Delpuch et al. | Jul 2004 | A1 |
20040133923 | Watson et al. | Jul 2004 | A1 |
20040136698 | Mock | Jul 2004 | A1 |
20040168186 | Rector et al. | Aug 2004 | A1 |
20040172648 | Xu et al. | Sep 2004 | A1 |
20040189658 | Dowdy | Sep 2004 | A1 |
20040194136 | Finseth et al. | Sep 2004 | A1 |
20040199578 | Kapczynski et al. | Oct 2004 | A1 |
20040221306 | Noh | Nov 2004 | A1 |
20040224723 | Farcasiu | Nov 2004 | A1 |
20040225751 | Urali | Nov 2004 | A1 |
20040226051 | Carney et al. | Nov 2004 | A1 |
20050005288 | Novak | Jan 2005 | A1 |
20050015796 | Bruckner et al. | Jan 2005 | A1 |
20050015804 | LaJoie et al. | Jan 2005 | A1 |
20050028208 | Ellis et al. | Feb 2005 | A1 |
20050086172 | Stefik | Apr 2005 | A1 |
20050125835 | Wei | Jun 2005 | A1 |
20050149972 | Knudson | Jul 2005 | A1 |
20050155063 | Bayrakeri et al. | Jul 2005 | A1 |
20050259147 | Nam et al. | Nov 2005 | A1 |
20050262542 | DeWeese et al. | Nov 2005 | A1 |
20050283800 | Ellis et al. | Dec 2005 | A1 |
20050287948 | Hellwagner et al. | Dec 2005 | A1 |
20060004743 | Murao et al. | Jan 2006 | A1 |
20060059525 | Jerding et al. | Mar 2006 | A1 |
20060068818 | Leitersdorf et al. | Mar 2006 | A1 |
20060080707 | Laksono | Apr 2006 | A1 |
20060080716 | Nishikawa et al. | Apr 2006 | A1 |
20060104511 | Guo et al. | May 2006 | A1 |
20060105793 | Gutowski et al. | May 2006 | A1 |
20060125962 | Shelton et al. | Jun 2006 | A1 |
20060143191 | Cho et al. | Jun 2006 | A1 |
20060156336 | Knudson et al. | Jul 2006 | A1 |
20060195865 | Fablet | Aug 2006 | A1 |
20060200842 | Chapman et al. | Sep 2006 | A1 |
20060206470 | McIntyre | Sep 2006 | A1 |
20060206912 | Klarfeld et al. | Sep 2006 | A1 |
20060233514 | Weng et al. | Oct 2006 | A1 |
20060248572 | Kitsukama et al. | Nov 2006 | A1 |
20070019001 | Ha | Jan 2007 | A1 |
20070064715 | Lloyd et al. | Mar 2007 | A1 |
20070083538 | Roy et al. | Apr 2007 | A1 |
20070112761 | Xu et al. | May 2007 | A1 |
20070220016 | Estrada et al. | Sep 2007 | A1 |
20070261072 | Boulet et al. | Nov 2007 | A1 |
20070271587 | Rowe | Nov 2007 | A1 |
20080037722 | Klassen | Feb 2008 | A1 |
20080060011 | Kelts | Mar 2008 | A1 |
20080071770 | Schloter et al. | Mar 2008 | A1 |
20080148317 | Opaluch | Jun 2008 | A1 |
20080163304 | Ellis | Jul 2008 | A1 |
20080189740 | Carpenter et al. | Aug 2008 | A1 |
20080196070 | White et al. | Aug 2008 | A1 |
20080235725 | Hendricks | Sep 2008 | A1 |
20080276278 | Krieger et al. | Nov 2008 | A1 |
20080282294 | Carpenter et al. | Nov 2008 | A1 |
20080288644 | Gilfix et al. | Nov 2008 | A1 |
20080301320 | Morris | Dec 2008 | A1 |
20080301732 | Archer et al. | Dec 2008 | A1 |
20080317233 | Rey et al. | Dec 2008 | A1 |
20090019485 | Ellis et al. | Jan 2009 | A1 |
20090024629 | Miyauchi | Jan 2009 | A1 |
20090094632 | Newnam et al. | Apr 2009 | A1 |
20090094651 | Damm et al. | Apr 2009 | A1 |
20090133025 | Malhotra et al. | May 2009 | A1 |
20090164904 | Horowitz et al. | Jun 2009 | A1 |
20090183210 | Andrade | Jul 2009 | A1 |
20090222872 | Schlack | Sep 2009 | A1 |
20090228441 | Sandvik | Sep 2009 | A1 |
20090249427 | Dunnigan et al. | Oct 2009 | A1 |
20090271829 | Larsson et al. | Oct 2009 | A1 |
20090288132 | Hegde | Nov 2009 | A1 |
20090292548 | Van Court | Nov 2009 | A1 |
20100023966 | Shahraray et al. | Jan 2010 | A1 |
20100077057 | Godin et al. | Mar 2010 | A1 |
20100079670 | Frazier et al. | Apr 2010 | A1 |
20100175084 | Ellis et al. | Jul 2010 | A1 |
20100180300 | Carpenter et al. | Jul 2010 | A1 |
20100223640 | Reichardt et al. | Sep 2010 | A1 |
20100250190 | Zhang et al. | Sep 2010 | A1 |
20100251284 | Ellis et al. | Sep 2010 | A1 |
20100257548 | Lee et al. | Oct 2010 | A1 |
20110055282 | Hoving | Mar 2011 | A1 |
20110058101 | Earley et al. | Mar 2011 | A1 |
20110087348 | Wong | Apr 2011 | A1 |
20110093909 | Roberts et al. | Apr 2011 | A1 |
20110131204 | Bodin et al. | Jun 2011 | A1 |
20110209180 | Ellis et al. | Aug 2011 | A1 |
20110211813 | Marks | Sep 2011 | A1 |
20110214143 | Rits et al. | Sep 2011 | A1 |
20110219386 | Hwang et al. | Sep 2011 | A1 |
20110219419 | Reisman | Sep 2011 | A1 |
20110225417 | Maharajh et al. | Sep 2011 | A1 |
20110246495 | Mallinson | Oct 2011 | A1 |
20110247042 | Mallinson | Oct 2011 | A1 |
20120002111 | Sandoval et al. | Jan 2012 | A1 |
20120011550 | Holland | Jan 2012 | A1 |
20120054811 | Spears | Mar 2012 | A1 |
20120117151 | Bill | May 2012 | A1 |
20120192226 | Zimmerman et al. | Jul 2012 | A1 |
20120227073 | Hosein et al. | Sep 2012 | A1 |
20120233646 | Coniglio | Sep 2012 | A1 |
20120295686 | Lockton | Nov 2012 | A1 |
20120324002 | Chen | Dec 2012 | A1 |
20120324494 | Burger et al. | Dec 2012 | A1 |
20120324495 | Matthews, III | Dec 2012 | A1 |
20120324518 | Thomas et al. | Dec 2012 | A1 |
20130014155 | Clarke et al. | Jan 2013 | A1 |
20130040623 | Chun | Feb 2013 | A1 |
20130051770 | Sargent | Feb 2013 | A1 |
20130103446 | Bragdon | Apr 2013 | A1 |
20130110769 | Ito | May 2013 | A1 |
20130111514 | Slavin et al. | May 2013 | A1 |
20130170813 | Woods | Jul 2013 | A1 |
20130176493 | Khosla | Jul 2013 | A1 |
20130198642 | Carney et al. | Aug 2013 | A1 |
20130262997 | Markworth et al. | Oct 2013 | A1 |
20130298038 | Spivack et al. | Nov 2013 | A1 |
20130316716 | Tapia et al. | Nov 2013 | A1 |
20130326570 | Cowper et al. | Dec 2013 | A1 |
20130332839 | Frazier et al. | Dec 2013 | A1 |
20130332852 | Castanho et al. | Dec 2013 | A1 |
20130347018 | Limp et al. | Dec 2013 | A1 |
20130347030 | Oh et al. | Dec 2013 | A1 |
20140006951 | Hunter | Jan 2014 | A1 |
20140009680 | Moon et al. | Jan 2014 | A1 |
20140032473 | Enoki | Jan 2014 | A1 |
20140068648 | Green et al. | Mar 2014 | A1 |
20140075465 | Petrovic | Mar 2014 | A1 |
20140089423 | Jackels | Mar 2014 | A1 |
20140089967 | Mandalia | Mar 2014 | A1 |
20140129570 | Johnson | May 2014 | A1 |
20140149918 | Asokan et al. | May 2014 | A1 |
20140150022 | Oh et al. | May 2014 | A1 |
20140237498 | Ivins | Aug 2014 | A1 |
20140267931 | Gilson et al. | Sep 2014 | A1 |
20140279852 | Chen | Sep 2014 | A1 |
20140280695 | Sharma et al. | Sep 2014 | A1 |
20140282122 | Mathur | Sep 2014 | A1 |
20140325359 | Vehovsky | Oct 2014 | A1 |
20140327677 | Walker | Nov 2014 | A1 |
20140359662 | Packard et al. | Dec 2014 | A1 |
20140365302 | Walker | Dec 2014 | A1 |
20140373032 | Merry et al. | Dec 2014 | A1 |
20150026743 | Kim et al. | Jan 2015 | A1 |
20150263923 | Kruglick | Sep 2015 | A1 |
Number | Date | Country |
---|---|---|
0624039 | Nov 1994 | EP |
0963115 | Dec 1999 | EP |
1058999 | Dec 2000 | EP |
1080582 | Mar 2001 | EP |
2323489 | Sep 1998 | GB |
9963757 | Dec 1999 | WO |
0011869 | Mar 2000 | WO |
0033576 | Jun 2000 | WO |
0110115 | Feb 2001 | WO |
0182613 | Nov 2001 | WO |
02063426 | Aug 2002 | WO |
02063471 | Aug 2002 | WO |
02063851 | Aug 2002 | WO |
02063878 | Aug 2002 | WO |
03009126 | Jan 2003 | WO |
2003026275 | Mar 2003 | WO |
2011053271 | May 2011 | WO |
2012094105 | Jul 2012 | WO |
2012154541 | Nov 2012 | WO |
Entry |
---|
Watchwith webpage; http://www.watchwith.com/content_owners/watchwith_platform_components.jsp (last visited Mar. 12, 2013). |
Matt Duffy; TVplus App reveals content click-through rates north of 10% across sync-enabled programming; http://www.tvplus.com/blog/TVplus-App-reveals-content-click-through-rates-north-of-10-Percent-across-sync-enabled-programming (retrieved from the Wayback Machine on Mar. 12, 2013). |
“In Time for Academy Awards Telecast, Companion TV App Umami Debuts First Real-Time Sharing of a TV Program's Images”; Umami News; http:www.umami.tv/2012-02-23.html (retrieved from the Wayback Machine on Mar. 12, 2013). |
Fernando Pereira, “The MPEG-4 Book”, Prentice Hall, Jul. 10, 2002. |
Michael Adams, “Open Cable Architecture”, Cisco Press, Dec. 3, 1999. |
Andreas Kraft and Klaus Hofrichter, "An Approach for Script-Based Broadcast Application Production", Springer-Verlag Berlin Heidelberg, pp. 74-82, 1999. |
Mark Riehl, “XML and Perl”, Sams, Oct. 16, 2002. |
MetaTV, Inc., PCT/US02/29917 filed Sep. 19, 2002, International Search Report dated Apr. 14, 2003; ISA/US; 6 pages. |
Sylvain Devillers, “Bitstream Syntax Definition Language: an Input to MPEG-21 Content Representation”, Mar. 2001, ISO, ISO/IEC JTC1/SC29/WG11 MPEG01/M7053. |
Shim, et al., “A SMIL Based Graphical Interface for Interactive TV”, Internet Tech. Laboratory Dept. of Comp. Engineering, San Jose State University, pp. 257-266. |
Yoon, et al., "Video Gadget: MPEG-7 Based Audio-Visual Content Indexing and Browsing Engine", LG Electronics Institute of Technology, pp. 59-68. |
Boronat F et al: "Multimedia group and inter-stream synchronization techniques: A comparative study", Information Systems, Pergamon Press, Oxford, GB, vol. 34, No. 1, Mar. 1, 2009, pp. 108-131, XP025644936. |
Extended European Search Report—EP14159227.9—dated Sep. 3, 2014. |
CA Response to Office Action—CA Appl. 2,685,833—Submitted Jul. 17, 2015. |
Canadian Office Action—CA 2,685,833—dated Jan. 22, 2015. |
European Extended Search Report—EP 13192112.4—dated May 11, 2015. |
Response to European Office Action—European Appl. 13192112.4—submitted Dec. 9, 2015. |
CA Office Action—CA App 2,685,833—dated Jan. 27, 2016. |
European Office Action—EP App 14159227.9—dated Jul. 12, 2016. |
Agnieszka Zagozdzinnska et al. “TRIDAQ Systems in HEP Experiments at LHC Accelerator” Kwartalnik Elektroniki I Telekomunikacji, vol. 59, No. 4, Jan. 1, 2013. |
CA Office Action—CA Application 2685833—dated Feb. 8, 2017. |
Mar. 9, 2018—European Office Action—EP 13192112.4. |
Feb. 19, 2018—European Summons to Oral Proceedings—EP 14159227.9. |
Nov. 29, 2017—Canadian Office Action—CA 2,685,833. |
Jul. 31, 2018—European Decision to Refuse—14159227.9. |
Sep. 5, 2019—Canadian Office Action—CA 2,685,833. |
Nov. 6, 2019—Canadian Office Action—CA 2,832,800. |
Number | Date | Country | |
---|---|---|---|
20130198642 A1 | Aug 2013 | US |