Currently, when watching live events on television (TV), a viewer cannot pull up live information on the event (e.g., stats or live data/results) on the TV alongside the video content being displayed. Furthermore, there is no interactive option by which the viewer can specify and prioritize what data they would like to see.
Previous solutions required a user (also referred to as a watcher) to have a separate companion device to monitor relevant events outside the broadcast on a second screen, such as a tablet computing device or a smart phone.
A system is provided for displaying video content and a companion panel displayed alongside the video content. The companion panel can be or include a user interface (UI) element that is shown while watching the video content and that serves as an interactive information display with info/data related to the video content being played. In such a case, the companion panel may also be referred to as an activity panel. For example, when watching a live football game on ESPN, the companion or activity panel may show the live box score, scoring leaders and other game stats synced with the video stream. Furthermore, this companion or activity panel may be used for interactive actions such as participating in live polls and/or viewing fantasy team information, which are contextual to (but not limited to) the live game being shown.
In accordance with an embodiment, a client system obtains video content data from a remote system and obtains or determines corresponding video time data. Additionally, the client system obtains contextual content data and corresponding contextual time data from a remote system. The client system identifies portions of the contextual content data that are temporally related to portions of the video content data based on the contextual time data and the video time data. Further, the client system displays a portion of the video content data on the display device. Additionally, based on results of the identifying, while displaying the portion of the video content data on the display device, the client system also displays, alongside the portion of the video content data, a portion of the contextual content data that is relevant to the portion of the video content data being displayed on the display device. Such contextual content data can be displayed, e.g., in a companion panel.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
A system is disclosed for displaying video content and a companion panel displayed next to or over the video content. The companion panel includes contextual content data relating to the video content. The companion panel may be interactive so that a user can select a link in the companion panel to explore the linked data in greater detail.
In embodiments, the video may be a linear TV broadcast rendered as full-screen HDMI pass-through video by a user's client computing device (also referred to as a client system) onto a user's TV. The video may include any of a variety of different types of content, such as for example a sporting event. Other types of content are contemplated.
In embodiments, the video may be identified, and thereafter a search may be performed to identify information relating to the video. This identification and search may be performed by a user's client computing device or a central service that is linked to the client's computing device by a network such as for example the Internet.
In embodiments, the video may for example be identified using electronic program guide (EPG) data and metadata relating to the scheduled TV program video that the user is viewing. Alternatively, the central service may keep track of the content being displayed, enabling it to identify and provide information relating to the video. The client computing device or central service may use the TV program ID from the EPG to query for data or data feeds relevant to the identified TV program video. This query may be performed in computers of the central service or over the World Wide Web in general. The program ID and/or keywords from the metadata associated with the program in the EPG or from the central service may be used as keyword searches to identify relevant data, events and/or data feeds.
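By way of a non-limiting illustration, the following Python sketch shows one possible way such a keyword query could be formed from the program ID and EPG metadata. The endpoint URL, function name (build_feed_query) and field names (e.g., programId) are hypothetical placeholders and are not part of any particular central service.

```python
import urllib.parse

def build_feed_query(program_id: str, epg_metadata: dict,
                     base_url: str = "https://central-service.example/feeds") -> str:
    """Form a keyword query for data or data feeds relevant to the identified
    TV program, using the program ID and keywords from the EPG metadata."""
    keywords = [epg_metadata.get("title", "")] + list(epg_metadata.get("keywords", []))
    params = {"programId": program_id, "q": " ".join(k for k in keywords if k)}
    return base_url + "?" + urllib.parse.urlencode(params)

# Example: EPG entry for a scheduled broadcast of a live football game.
print(build_feed_query("EPG-12345",
                       {"title": "Pro Football Live", "keywords": ["scores", "stats"]}))
```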
It is understood that this information may come from a variety of other sources and may be accumulated in a variety of other manners in further embodiments. The information may be contextual live data synced with the video stream. For example, utilizing score data and stats data feeds delivered through the central service, the live information for the event may be synced with the video feed and delivered to the user as a unified experience.
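A minimal, non-limiting sketch of how contextual content data might be matched to video content data using the respective time data is shown below. The data structures and names (e.g., ContextualItem, find_contextual_for) are illustrative assumptions only, not a required implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ContextualItem:
    """A portion of contextual content data together with its contextual time data."""
    start_time: float   # seconds into the event/stream at which this item applies
    end_time: float
    payload: dict       # e.g., live box score, scoring leaders, other game stats

def find_contextual_for(video_time: float,
                        items: List[ContextualItem]) -> Optional[ContextualItem]:
    """Identify the portion of contextual content data temporally related to the
    portion of the video content currently being displayed."""
    for item in items:
        if item.start_time <= video_time < item.end_time:
            return item
    return None

# While the frame at t=95 seconds of the game is on screen, the companion panel
# would show the box score that applies to that span of the video stream.
items = [ContextualItem(0.0, 120.0, {"box_score": "0-0"}),
         ContextualItem(120.0, 300.0, {"box_score": "7-0"})]
current = find_contextual_for(95.0, items)
print(current.payload if current else "no contextual data yet")
```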
Referring to
In certain embodiments, the companion panel 102 is interactive, in which case it can also be referred to as an activity panel. Upon a user selecting the companion panel 102, such as for example via a selection device, the companion panel 102 may present additional information on the selected topic. Such a selection device can be a game controller, a remote control, a touchpad, a trackpad, a mouse, or a microphone that accepts voice commands, but is not limited thereto.
Moreover, companion panel 102 may include interactive elements that correspond to the video stream. These may be curated programmatically or manually by a live operations team. In a programmatic example, while watching a live game, before the game is scheduled to start, a live poll question may be presented such as, "Who do you think will win?" with the two teams listed as options. The user can make their selection and see global pick trends. In a manual example, using the same sporting event, half-way through the game live operations personnel may post a situation-based question through a live publishing tool such as: "Do you agree with the referee's decision on ejecting [PlayerX] from the game?" with a list of possible answers.
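By way of example and not limitation, the following Python sketch illustrates one way such a live poll element could be represented on the client system, covering both a programmatically scheduled poll and one posted manually via a live publishing tool. The PollElement class, its fields and the rounding of pick trends are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PollElement:
    """An interactive live poll shown in the companion/activity panel."""
    question: str
    options: List[str]
    display_time: float                       # stream time at which to surface the poll
    votes: Dict[str, int] = field(default_factory=dict)

    def vote(self, option: str) -> None:
        self.votes[option] = self.votes.get(option, 0) + 1

    def pick_trends(self) -> Dict[str, float]:
        """Global pick trends, as a rounded percentage of all responses per option."""
        total = sum(self.votes.values()) or 1
        return {o: round(100.0 * self.votes.get(o, 0) / total, 1) for o in self.options}

# Programmatic example, scheduled before the game starts:
pregame = PollElement("Who do you think will win?", ["Team A", "Team B"], display_time=0.0)
pregame.vote("Team A"); pregame.vote("Team A"); pregame.vote("Team B")
print(pregame.pick_trends())   # {'Team A': 66.7, 'Team B': 33.3}

# Manual example, posted half-way through the game via a live publishing tool:
midgame = PollElement("Do you agree with the referee's decision?", ["Yes", "No"],
                      display_time=1800.0)
```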
The companion panel 102 may provide a variety of additional links and information, including news stories, and historical, statistical and biographical information. In one example, the central service or other cloud services may use the cloud data to query for relevant related IPTV video content which may then also be displayed as part of the expanded companion panel 102.
In an alternative embodiment, selection of a link in the companion panel 102 may bring up additional information that is displayed on a second client device, instead of the same device that is displaying the underlying video content 100. SmartGlass, for example, is a known software platform allowing information to be viewed on a second connected device.
The high level flow diagram in
Referring to
Still referring to
As will be discussed in additional detail below, a client system can receive contextual content data corresponding to a specific point in time (which is received at the client system from a remote system at step 206) prior to the client system receiving the corresponding video content data (which is received at the client system from a remote system at step 204). This may happen, e.g., because the video content data corresponding to a point in time is likely larger than the contextual content data for that same point in time, and thus, may take longer to be transferred from a remote system to the client system. This may also happen if the transmission of the video content data from a remote system is delayed longer than the transmission of the contextual content data from a remote system. Accordingly, while step 206 is shown as following step 204 in
Still referring to
Still referring to
Multiple instances of each of the steps shown in
In accordance with certain embodiments, the contextual content data that is displayed alongside the portion of the video content data being displayed includes one or more interactive elements relevant to the portion of the video content data being displayed on the display device. For example, the interactive element may be or include a polling question. Continuing with the example where a football game is being displayed, an exemplary polling question asked just prior to the football game beginning, and/or at one or more times during the game (e.g., at halftime), is "Who do you think will win?" with the two teams listed as options. The user can make their selection and see global pick trends. In a manual example, using the same sporting event, half-way through the game live operations personnel may post a situation-based question through a live publishing tool such as: "Do you agree with the referee's decision on ejecting [PlayerX] from the game?" with a list of possible answers.
Other interactive elements of contextual content data may include buttons that enable the user to view additional contextual data relevant to an event that is being or was just displayed to the user. For example, assuming the scoring of a touchdown was just displayed, interactive elements of contextual content data may include options for viewing a replay, viewing additional information about the player that scored the touchdown (e.g., information about how many touchdowns that player has scored during the current game, during the current season and/or during that player's career), and/or viewing a list of team touchdown leaders, league touchdown leaders and/or touchdown leaders for the specific position (e.g., running back or wide receiver) of the player that just scored the touchdown. A further type of interactive element of the contextual content data is an option for enabling the user to obtain additional contextual content data not currently being displayed. For example, a button can be presented to the user that says "see more options," or the like. In response to such a button being selected by the user, options for additional relevant contextual content data can be presented to the user, from which the user can make a selection. The additional contextual data may have already been received and stored by the client system, or the client system may send requests for the additional contextual data to a remote system.
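One non-limiting way such interactive elements could be represented on the client system is sketched below in Python. The InteractiveOption structure, the option labels and the split between locally cached data and data requested from a remote system are illustrative assumptions rather than a required implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class InteractiveOption:
    """A selectable element offered in the companion panel after an event."""
    label: str
    cached_data: Optional[dict] = None           # already received and stored by the client
    fetch: Optional[Callable[[], dict]] = None   # request to a remote system if not cached

    def select(self) -> dict:
        if self.cached_data is not None:
            return self.cached_data
        return self.fetch() if self.fetch is not None else {}

# Options that might be offered just after a touchdown is displayed; selecting
# "See more options" could expand into further choices for the user to pick from.
touchdown_options: List[InteractiveOption] = [
    InteractiveOption("View replay", fetch=lambda: {"clip": "replay of the touchdown"}),
    InteractiveOption("About the scorer", cached_data={"touchdowns_this_season": 9}),
    InteractiveOption("League touchdown leaders",
                      fetch=lambda: {"leaders": ["Player A", "Player B"]}),
    InteractiveOption("See more options", fetch=lambda: {"more": ["Fantasy team", "Polls"]}),
]
print(touchdown_options[1].select())
```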
Assuming a user participates in a fantasy football league, another interactive element of the contextual content data can be an option for the user to obtain information related to the fantasy football league in which the user participates. For example, in response to such an option being selected, highlights of players on the user's fantasy team may be displayed, and/or a listing of points earned by the players on the user's team can be displayed. These are just a few examples, which are not meant to be all-encompassing.
Additional details of step 204 will now be described with reference to
Still referring to
Additional details of step 206 will now be described with reference to
In order to collect contextual content data, a remote system can, e.g., use a TV program ID from an EPG to query for data or data feeds relevant to particular video content data. Such queries may be performed, e.g., by one or more computers of the remote system. The program ID and/or keywords from the metadata associated with the program in the EPG may be used as keyword searches to identify relevant contextual data. The contextual content data may come from a variety of different sources and may be accumulated in a variety of different manners. Since the present technology is not primarily focused on how a remote system obtains contextual data, additional details of how a remote system may obtain contextual data are not provided herein.
Additional details of step 208 will now be described with reference to
As mentioned above, contextual content data corresponding to a specific point in time (which is received at the client system from a remote system, e.g., at step 206) may be received prior to corresponding video content data (which is received at the client system from a remote system, e.g., at step 204). This may happen, as mentioned above, because the video content data corresponding to a point in time is likely larger than the contextual content data for that same point in time, and/or transmission of video content data may be delayed longer than the transmission of the contextual content data from a remote system. Accordingly, the client system may temporarily store contextual content data to have the contextual content data available to it and ready to be displayed as soon as the video content data being displayed catches up to the contextual content data.
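By way of illustration only, the following sketch shows one possible way the client system might temporarily store early-arriving contextual content data and release it once the displayed video time catches up. The ContextualBuffer class and its methods are hypothetical names used only for this example.

```python
from typing import Dict, List, Tuple

class ContextualBuffer:
    """Temporarily stores contextual content data that arrives before the video
    content data it relates to, releasing it once playback catches up."""

    def __init__(self) -> None:
        self._pending: List[Tuple[float, Dict]] = []   # (contextual time, payload)

    def store(self, contextual_time: float, payload: Dict) -> None:
        self._pending.append((contextual_time, payload))

    def release_up_to(self, video_time: float) -> List[Dict]:
        """Return every buffered item whose contextual time has been reached by
        the video content currently being displayed."""
        ready = [p for t, p in self._pending if t <= video_time]
        self._pending = [(t, p) for t, p in self._pending if t > video_time]
        return ready

# A score update for t=300s may arrive while the displayed video is still at
# t=290s; it remains buffered until the displayed video catches up to that point.
buf = ContextualBuffer()
buf.store(300.0, {"score": "14-7"})
print(buf.release_up_to(290.0))   # []
print(buf.release_up_to(305.0))   # [{'score': '14-7'}]
```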
Still referring to
In accordance with an embodiment, whether a user watches content (e.g., a football game) substantially live, or a few days later, the content displayed to the user is the same. In other words, if a user chooses to view a football game that took place two days earlier, the viewing experience provided to the user will be the same as if the user viewed the football game during the same time the game was actually being played. A benefit of this is that contextual content data will not accidentally include "spoilers", such as identifying the team that already won the game.
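A minimal sketch of this time-gating behavior follows, assuming contextual items carry a time relative to the start of the game; the function and field names are illustrative only.

```python
def visible_contextual(all_items: list, playback_time: float) -> list:
    """Show only contextual content data at or before the viewer's current
    playback position, so later results (e.g., the final score) are never
    revealed to a viewer watching the game days after it was played."""
    return [item for item in all_items if item["time"] <= playback_time]

game_feed = [
    {"time": 0.0,    "text": "Kickoff"},
    {"time": 1800.0, "text": "Halftime score: 14-10"},
    {"time": 3600.0, "text": "Final score: 28-17"},
]
# A viewer 25 minutes into a replay of the game sees only the kickoff item,
# exactly as a live viewer would have at that same point in the game.
print(visible_contextual(game_feed, 1500.0))
```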
According to one embodiment, computing system 412 may be connected to an audio/visual device 416 such as a television, a monitor, a high-definition television (HDTV), or the like that may provide television, movie, video, game or application visuals and/or audio to a user. For example, the computing system 412 may include a video adapter such as a graphics card and/or an audio adapter such as a sound card that may provide audiovisual signals associated with the game application, non-game application, or the like. The audio/visual device 416 may receive the audio/visual signals from the computing system 412 and may then output the television, movie, video, game or application visuals and/or audio to the user. According to one embodiment, audio/visual device 416 may be connected to the computing system 412 via, for example, an S-Video cable, a coaxial cable, an HDMI cable, a DVI cable, a VGA cable, component video cable, or the like. Audio/visual device 416 may be used to display the video content 100 and contextual companion panel 102 described above.
Entertainment system 400 may be used to recognize, analyze, and/or track one or more humans. For example, a user may be tracked using the capture device 420 such that the gestures and/or movements of user may be captured to animate an avatar or on-screen character and/or may be interpreted as controls that may be used to affect the application being executed by computing system 412. Thus, according to one embodiment, a user may move his or her body (e.g., using gestures) to control the interaction with a program being displayed on audio/visual device 416.
A graphics processing unit (GPU) 508 and a video encoder/video codec (coder/decoder) 514 form a video processing pipeline for high speed and high resolution graphics processing. Data is carried from the graphics processing unit 508 to the video encoder/video codec 514 via a bus. The video processing pipeline outputs data to an A/V (audio/video) port 540 for transmission to a television or other display. A memory controller 510 is connected to the GPU 508 to facilitate processor access to various types of memory 512, such as, but not limited to, a RAM (Random Access Memory).
The multimedia console 500 includes an I/O controller 520, a system management controller 522, an audio processing unit 523, a network (or communication) interface 524, a first USB host controller 526, a second USB controller 528 and a front panel I/O subassembly 530 that are preferably implemented on a module 518. The USB controllers 526 and 528 serve as hosts for peripheral controllers 542(1)-542(2), a wireless adapter 548 (another example of a communication interface), and an external memory device 546 (e.g., flash memory, external CD/DVD ROM drive, removable media, etc., any of which may be non-volatile storage). The network interface 524 and/or wireless adapter 548 provide access to a network (e.g., the Internet, home network, etc.) and may be any of a wide variety of wired or wireless adapter components including an Ethernet card, a modem, a Bluetooth module, a cable modem, and the like. For a more specific example, the network interface 524 may enable a client system, e.g., 500, to communicate with a remote system that can provide the client system with video content data and/or contextual content data in accordance with embodiments described herein.
System memory 543 is provided to store application data that is loaded during the boot process. A media drive 544 is provided and may comprise a DVD/CD drive, Blu-Ray drive, hard disk drive, or other removable media drive, etc. (any of which may be non-volatile storage). The media drive 544 may be internal or external to the multimedia console 500. Application data may be accessed via the media drive 544 for execution, playback, etc. by the multimedia console 500. The media drive 544 is connected to the I/O controller 520 via a bus, such as a Serial ATA bus or other high speed connection (e.g., IEEE 1394).
The system management controller 522 provides a variety of service functions related to assuring availability of the multimedia console 500. The audio processing unit 523 and an audio codec 532 form a corresponding audio processing pipeline with high fidelity and stereo processing. Audio data is carried between the audio processing unit 523 and the audio codec 532 via a communication link. The audio processing pipeline outputs data to the A/V port 540 for reproduction by an external audio user or device having audio capabilities.
The front panel I/O subassembly 530 supports the functionality of the power button 550 and the eject button 552, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the multimedia console 500. A system power supply module 536 provides power to the components of the multimedia console 500. A fan 538 cools the circuitry within the multimedia console 500.
The CPU 501, GPU 508, memory controller 510, and various other components within the multimedia console 500 are interconnected via one or more buses, including serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include a Peripheral Component Interconnects (PCI) bus, PCI-Express bus, etc.
When the multimedia console 500 is powered on, application data may be loaded from the system memory 543 into memory 512 and/or caches 502, 504 and executed on the CPU 501. The application may present a graphical user interface that provides a consistent user experience when navigating to different media types available on the multimedia console 500. In operation, applications and/or other media contained within the media drive 544 may be launched or played from the media drive 544 to provide additional functionalities to the multimedia console 500.
The multimedia console 500 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the multimedia console 500 allows one or more users to interact with the system, watch movies, or listen to music. However, with the integration of broadband connectivity made available through the network interface 524 or the wireless adapter 548, the multimedia console 500 may further be operated as a participant in a larger network community.
When the multimedia console 500 is powered ON, a set amount of hardware resources are reserved for system use by the multimedia console operating system. These resources may include a reservation of memory, CPU and GPU cycles, networking bandwidth, etc. Because these resources are reserved at system boot time, the reserved resources do not exist from the application's view. In particular, the memory reservation preferably is large enough to contain the launch kernel, concurrent system applications and drivers. The CPU reservation is preferably constant such that if the reserved CPU usage is not used by the system applications, an idle thread will consume any unused cycles.
With regard to the GPU reservation, lightweight messages generated by the system applications (e.g., pop ups) are displayed by using a GPU interrupt to schedule code to render the popup into an overlay. The amount of memory required for an overlay depends on the overlay area size, and the overlay preferably scales with screen resolution. Where a full user interface is used by the concurrent system application, it is preferable to use a resolution independent of application resolution. A scaler may be used to set this resolution such that the need to change frequency and cause a TV resync is eliminated.
After multimedia console 500 boots and system resources are reserved, concurrent system applications execute to provide system functionalities. The system functionalities are encapsulated in a set of system applications that execute within the reserved system resources described above. The operating system kernel identifies threads that are system application threads versus gaming application threads. The system applications are preferably scheduled to run on the CPU 501 at predetermined times and intervals in order to provide a consistent system resource view to the application. The scheduling is to minimize cache disruption for the gaming application running on the console.
When a concurrent system application requires audio, audio processing is scheduled asynchronously to the gaming application due to time sensitivity. A multimedia console application manager (described below) controls the gaming application audio level (e.g., mute, attenuate) when system applications are active.
Optional input devices (e.g., controllers 542(1) and 542(2)) are shared by gaming applications and system applications. The input devices are not reserved resources, but are to be switched between system applications and the gaming application such that each will have a focus of the device. The application manager preferably controls the switching of the input stream, without the gaming application's knowledge, and a driver maintains state information regarding focus switches. Capture device 420 may define additional input devices for the console 500 via USB controller 526 or other interface. In other embodiments, computing system 412 can be implemented using other hardware architectures. No one hardware architecture is required.
Computing system 620 comprises a computer 641, which typically includes a variety of computer readable media. The computer 641 is an example of a client system. Computer readable media can be any available media that can be accessed by computer 641 and includes both volatile and nonvolatile media, removable and non-removable media. The system memory 622 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 623 and random access memory (RAM) 660. A basic input/output system 624 (BIOS), containing the basic routines that help to transfer information between elements within computer 641, such as during start-up, is typically stored in ROM 623. RAM 660 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 659. By way of example, and not limitation,
The computer 641 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
The computer 641 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 646, which is an example of a remote system from which a client system can receive video content data and/or contextual content data. The remote computer 646 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 641, although only a memory storage device 647 has been illustrated in
When used in a LAN networking environment, the computer 641 is connected to the LAN 645 through a network interface 637. When used in a WAN networking environment, the computer 641 typically includes a modem 650 or other means for establishing communications over the WAN 649, such as the Internet. The modem 650, which may be internal or external, may be connected to the system bus 621 via the user input interface 636, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 641, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. It is intended that the scope of the invention be defined by the claims appended hereto.
This application claims priority to U.S. Provisional Patent Application No. 61/816,691, filed Apr. 26, 2013, which is incorporated herein by reference.