Watching streamed media content is increasingly popular and commonplace. For example, a viewer may subscribe to a streaming service and watch media items offered by that service from one or more display devices associated with the viewer's subscription account. To illustrate, the viewer may watch a selected media item (e.g., a TV show or movie) on a television. For example, the viewer's television may be wirelessly enabled to receive streamed content directly from the streaming service. In some examples, the viewer may have their television connected to any type of set-top receiver that receives the streamed media content and displays media content items via the television.
In addition to viewing streamed media content via a television, some streaming services offer additional ways to watch streamed media content. For example, some streaming services can stream media items to other types of display devices such as smartphones, tablet computers, laptop computers, smart wearables, and so forth. With these additional viewing options, viewers may watch streamed media items in multiple ways.
Despite this, most streaming services limit viewer interactions with media content items to one display device at a time. For example, most streaming services allow viewers to browse, select, and watch media content items on their TV (e.g., by using a remote control) or on their smartphone (e.g., by using touch gestures), but not both simultaneously. As such, these streaming services fail to provide a way for viewers to interact with media content items simultaneously via more than one display device.
As will be described in greater detail below, the present disclosure describes implementations that enable viewers to have rich interactions with media items via more than one display device. For example, implementations include pairing a first screen device displaying a media item and a second screen device acting as a controller relative to the media item, determining a context associated with the media item, generating, based on the context, instructions to modify one or more components of a playback control graphical user interface displayed by the second screen device, and transmitting the instructions to the second screen device to trigger modification of the one or more components of the playback control graphical user interface.
In some examples, the implementations further include pairing the first screen device displaying the media item and the second screen device acting as the controller by receiving a device subscription request from the first screen device, generating a device subscription list including the first screen device, receiving a device subscription request from the second screen device, adding the second screen device to the device subscription list, transmitting the device subscription list to the first screen device, and in response to receiving a request from the first screen device to pair with the second screen device, associating the first screen device and the second screen device.
Some examples further include, in response to associating the first screen device and the second screen device, receiving one or more messages from the first screen device and pushing the one or more messages to the second screen device. In at least one example, pairing the first screen device and the second screen device is in response to a detected selection of the first screen device from a pairing option list displayed by the second screen device.
Some examples also include, in response to pairing the first screen device and the second screen device, causing the second screen device to display an indication of successful pairing within the playback control graphical user interface displayed by the second screen device and causing the first screen device to display the indication of successful pairing within a media item graphical user interface displayed by the first screen device.
In some examples, implementations further include generating instructions to display the playback control graphical user interface on the second screen device, wherein the instructions to display the playback control graphical user interface are based on whether the media item is a standalone media item or an episodic media item. In at least one example, determining the context associated with the media item includes at least one of: determining that playback of the media item has been initiated, determining that playback of the media item has entered an introduction portion of the media item, determining that the media item is a last episode in a collection of episodes, or determining that a subtitle selection associated with the media item has been made.
In some examples, implementations further include detecting one or more user selections of components within the playback control graphical user interface on the second screen device, and communicating the detected one or more selections to the first screen device. In at least one example, generating instructions to modify one or more components of the playback control graphical user interface includes one or more of: generating instructions to provide a new component within the playback control graphical user interface, generating instructions to highlight an existing component within the playback control graphical user interface, generating instructions to modify an existing component within the playback control graphical user interface, or generating instructions to update a position of a playback scrubber within the playback control graphical user interface.
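The context-to-instruction flow described above can be sketched as follows. This is a minimal illustration only: the context fields, component names, and instruction shapes are assumptions introduced for the example and are not specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MediaContext:
    """Hypothetical snapshot of a media item's playback context."""
    is_episodic: bool = False
    in_intro: bool = False
    is_last_episode: bool = False
    playback_position: float = 0.0  # seconds into the media item

def generate_ui_instructions(ctx: MediaContext) -> list[dict]:
    """Map a context snapshot to playback control GUI modifications.

    Each instruction names a component and an action, mirroring the
    modification types listed above (provide a new component, modify
    an existing one, update the scrubber position).
    """
    instructions = []
    if ctx.in_intro:
        # Surface a "skip intro" control while the introduction plays.
        instructions.append({"action": "provide", "component": "skip_intro_button"})
    if ctx.is_episodic and not ctx.is_last_episode:
        # Episodes that are not the last in a collection get episode navigation.
        instructions.append({"action": "provide", "component": "next_episode_button"})
    # Keep the second screen's scrubber in sync with first-screen playback.
    instructions.append({
        "action": "update",
        "component": "playback_scrubber",
        "position": ctx.playback_position,
    })
    return instructions
```

In this sketch, the resulting instruction list would be transmitted to the second screen device, which applies each modification to its playback control graphical user interface.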
Some examples described herein include a system with at least one physical processor and physical memory including computer-executable instructions that, when executed by the at least one physical processor, cause the at least one physical processor to perform various acts. In at least one example, the computer-executable instructions, when executed by the at least one physical processor, cause the at least one physical processor to perform acts including pairing a first screen device displaying a media item and a second screen device acting as a controller relative to the media item, determining a context associated with the media item, generating, based on the context, instructions to modify one or more components of a playback control graphical user interface displayed by the second screen device, and transmitting the instructions to the second screen device to trigger modification of the one or more components of the playback control graphical user interface.
In additional examples, implementations include receiving, at a second screen device via a digital content system application, instructions to generate a control graphical user interface associated with a media item displayed on a first screen device, transmitting, in response to a detected selection of a pause/play button within the control graphical user interface, an electronic communication that initiates playback of the media item, receiving, at the second screen device via the digital content system application, instructions that trigger one or more modifications of the control graphical user interface based on playback of the media item entering an introduction portion of the media item, and modifying the control graphical user interface based on the received instructions. Some implementations further include receiving, at the second screen device via the digital content system application, additional instructions that trigger additional modifications of the control graphical user interface based on playback of the media item moving past the introduction portion of the media item, and modifying the control graphical user interface based on the received additional instructions.
In one or more examples, features from any of the implementations described herein are used in combination with one another in accordance with the general principles described herein. These and other implementations, features, and advantages will be more fully understood upon reading the following detailed description in conjunction with the accompanying drawings and claims.
The accompanying drawings illustrate a number of exemplary implementations and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
As mentioned above, most digital content streaming services limit the ways that a viewer can interact with a media item. To illustrate, a typical digital content streaming service enables viewers to use a remote control to browse media items, select media items, and initiate media item playback via their televisions. Similarly, the typical digital content streaming service enables viewers to use touch gestures to browse, select, and play media items via their smartphone or tablet computer. As such, typical digital content streaming services limit the streaming experiences of their viewers to a single screen device at a time, meaning viewers often have to pause playback of a media item to change language and subtitle settings, explore additional episode summaries, browse additional media items, and so forth. Moreover, some digital content streaming services provide language-changing services, episode overviews, subtitle selections, etc. in a way that overlays the main playback window such that playback of a media item is obscured from view.
In light of this, the present disclosure describes a system that enables simultaneous interactions with a single media item via two or more screen devices. For example, the disclosed system pairs a first screen device (e.g., a TV) and a second screen device (e.g., a smartphone). The disclosed system then enables interactions with a media item (e.g., a digital movie) via both display devices. To illustrate, the disclosed system enables a viewer to browse and select the media item via the second screen device for playback on the first screen device. During playback, the disclosed system further generates a context-aware playback control graphical user interface on the second screen device via which the viewer continues to interact with the media item by pausing and scrubbing media item playback, skipping predetermined portions of the media item, configuring language and subtitle settings, advancing episodes, and so forth. Additionally, the disclosed system utilizes the playback control graphical user interface on the second screen device such that playback on the first screen device is unobscured and easy to view. As such, the disclosed system leverages multiple screen devices to enrich and deepen the viewer's experience with the media item and to add flexibility and additional functionality to media item playback beyond what was previously available.
Features from any of the implementations described herein may be used in combination with one another in accordance with the general principles described herein. These and other implementations, features, and advantages will be more fully understood upon reading the following detailed description in conjunction with the accompanying drawings and claims.
The following will provide, with reference to
As just mentioned,
In one or more implementations, as shown in
As further shown in
In some implementations, as mentioned above, the digital content system application 118a, 118b is installed on the first screen device 114 and/or the second screen device 124. For example, in one implementation, the first screen device 114 and the second screen device 124 generate and transmit electronic messages via the digital content system applications 118a, 118b, respectively. Moreover, in at least one implementation, the first screen device 114 and the second screen device 124 receive electronic transmissions from the dynamic control system 102 via the digital content system applications 118a, 118b, respectively.
As mentioned above, the first screen device 114 and/or the second screen device 124 may be communicatively coupled with the server(s) 112 through the network 126. In one or more implementations, the network 126 may represent any type or form of communication network, such as the Internet, and may include one or more physical connections, such as a LAN, and/or wireless connections, such as a WAN. In some implementations, the network 126 may represent a telecommunications carrier network. In at least one implementation, the network 126 may represent combinations of networks such that the first screen device 114 communicates with the digital content system 104 via a first network while the second screen device 124 communicates with the dynamic control system 102 via a second network.
Although
As mentioned above,
As illustrated in
As further illustrated in
Moreover, at step 206 the dynamic control system 102 generates, based on the context, instructions to modify one or more components of a playback control graphical user interface displayed by the second screen device. In one example, the dynamic control system 102 generates the instructions to modify the one or more components of the playback control graphical user interface by generating instructions to provide a new component within the playback control graphical user interface, highlight an existing component within the playback control graphical user interface, modify an existing component within the playback control graphical user interface, and/or update a position of a playback scrubber within the playback control graphical user interface.
Additionally, as shown in
In one or more implementations, and as will be explained in greater detail below, the methods and steps performed by the dynamic control system 102 reference multiple terms. For example, as used herein, a “screen device” refers to a computing device that has a display. In some implementations, the screen device is interactive (e.g., a touch screen display). In some implementations, the screen device is associated with a remote control (e.g., as with a television). In most examples, a screen device includes network capabilities such that it can connect to the dynamic control system 102 over the network 126.
As used herein, a “media item” refers to digital audio and/or visual information that plays on a screen device. In some examples, a media item is a standalone media item such as a special or a movie. In some examples, a media item is an episodic media item such as a portion of a TV show that is part of a season.
As used herein, a “controller” refers to a device that acts as an intermediary between a viewer and a display device. In one example, a controller includes interactive components that enable a viewer to initiate playback of a media item on a display device, control the volume of the display device, navigate through menu options displayed on the display device, and so forth.
As used herein, “context” refers to a current circumstance, posture, or setting associated with a media item. In one or more examples, a media item's context includes its type, its playback status, one or more setting configurations associated with the media item, etc. Additionally, as used herein, “instructions” refer to an electronic message or encoding that informs one or more members of the exemplary networking environment 100 of a change. In one example, instructions refer to an electronic message originating from the second screen device 124 informing the first screen device 114 that a particular component of the playback control graphical user interface has been selected. In another example, instructions refer to an electronic encoding generated by the dynamic control system 102 (e.g., operating on the second screen device 124) that triggers the second screen device 124 to modify the playback control graphical user interface in response to receiving playback position information from the first screen device 114.
As mentioned above, the dynamic control system 102 pairs the first screen device 114 and the second screen device 124 and facilitates communications between the first screen device 114 and the second screen device 124. In many examples, the dynamic control system 102 further determines a context or context changes relative to media items and generates instructions that cause one or more of the first screen device 114 and the second screen device 124 to trigger user interface modifications based on those context changes.
In more detail,
In at least one implementation, the dynamic control system 102 overlays the pairing option list user interface 304 in response to a detected selection of a pairing control 303. In the example described throughout
As shown in
As shown in
In response to a successful pairing (e.g., following a detected selection of the pairing option 306a as shown in
In some implementations, as shown in
In one or more implementations, the dynamic control system 102 also generates updates for one or more user interfaces on the selected screen device (e.g., the first screen device 114) during the pairing process. For example, as shown in
In one or more implementations, the dynamic control system 102 generates—or causes the second screen device 124 to generate—a playback control graphical user interface that enables a user of the second screen device 124 to control playback of a media item playing on the first screen device 114.
For example, as shown in
In an alternative example, the dynamic control system 102 causes the second screen device 124 to generate a playback control graphical user interface 404 in response to a detected selection of the pairing control 303 (e.g., shown in
In more detail, the components of the playback control graphical user interface 404 enable the viewer to interact with the media item and/or the digital content system 104 in different ways. For example, the dynamic control system 102 provides search capabilities in response to detected interaction with the search box 406. Moreover, the dynamic control system 102 provides list displays of additional media items in response to detected interactions with any of the media item options 408a-408d. Furthermore, the dynamic control system 102 enables interactions with digital content system 104 menus and options in response to detected selection of any of the portions of the LRUD control 410. The dynamic control system 102 also enables navigations through the digital content system application 118b on the second screen device 124 in response to detected selections of the back button 412 and/or home button 416. Additionally, the dynamic control system 102 enables volume control on the first screen device 114 in response to detected interactions with either of the volume buttons 414a, 414b.
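The routing of second-screen selections described above can be sketched as a simple dispatch, where each detected interaction is translated into a message for the appropriate recipient. The component names and message targets below are illustrative assumptions, not identifiers from the disclosure.

```python
def route_control_event(component: str, payload=None) -> dict:
    """Translate a touch selection on the second screen into a message.

    Navigation buttons act locally on the second screen application,
    LRUD and volume interactions are relayed to the first screen, and
    search/browse interactions go to the digital content system backend.
    """
    local_components = {"back_button", "home_button"}
    first_screen_components = {
        "lrud_up", "lrud_down", "lrud_left", "lrud_right", "lrud_select",
        "volume_up", "volume_down",
    }
    if component in local_components:
        return {"target": "second_screen", "event": component}
    if component in first_screen_components:
        return {"target": "first_screen", "event": component}
    # Anything else (e.g., the search box or media item options) is
    # forwarded to the streaming backend with its payload.
    return {"target": "digital_content_system", "event": component, "payload": payload}
```

For example, a detected tap on a volume button would yield a message addressed to the first screen device, while a query typed into the search box would be forwarded to the backend.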
In further detail, as shown in
In one or more examples, as shown in
In an alternative example, the dynamic control system 102 enables the viewer to interact with an on-screen keyboard 424 on the first screen device 114 by manipulating one or more components of the playback control graphical user interface 404 on the second screen device 124. For example, in response to detected selections of the LRUD control 410 on the second screen device 124, the dynamic control system 102 can indicate key selections in the on-screen keyboard 424 on the first screen device 114 and can navigate among and select a media title from the titles included in the search result listing 426. Thus, the dynamic control system 102 enables the viewer to input a search term 428 to identify a specific media item.
As further shown in
In some implementations, the dynamic control system 102 further disconnects the second screen device 124 from the first screen device 114 in response to additional user selections on the second screen device 124. For example, as shown in
In one or more implementations, as shown in
In some examples, as mentioned above, the dynamic control system 102 dynamically modifies the playback control graphical user interface 404 in response to determining a context or context change associated with a media item.
Additionally, as shown in
In one or more examples, the dynamic control system 102 dynamically modifies components of the playback control graphical user interface 404 based on context changes associated with a media item. To illustrate, in one example, as shown in
In response to determining that the current media item is an episode, the dynamic control system 102 provides a different selection of components, as further shown in
In additional examples, the dynamic control system 102 provides or modifies additional components of the playback control graphical user interface 404 based on a context or contextual changes associated with a media item. For example, as shown in
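The distinction drawn above between standalone and episodic media items can be illustrated with a small selector. The specific component lists are assumptions for the sake of the example; the disclosure only establishes that the two media item types receive different selections of components.

```python
def playback_components(is_episodic: bool) -> list[str]:
    """Return the playback control components shown for a media item type.

    Both types share core transport controls; episodic items
    additionally receive episode-navigation components.
    """
    base = [
        "pause_play",
        "playback_scrubber",
        "rewind_10",
        "fast_forward_10",
        "audio_subtitles",
    ]
    if is_episodic:
        # Episodes expose navigation to other episodes in the collection.
        return base + ["next_episode", "episode_list"]
    return base
```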
In one or more implementations, the dynamic control system 102 provides additional functionality within the playback control graphical user interface 404 during playback of a media item. For example, in response to a detected selection of the LRUD control switch 514, the dynamic control system 102 modifies the playback control graphical user interface 404 by adding the LRUD control 410 as shown in
In at least one example, the dynamic control system 102 enables media item scrubbing on the second screen device 124 while the media item playback proceeds normally on the first screen device 114. For example, as shown in
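The scrubbing behavior just described, where the second screen previews a position while the first screen keeps playing, can be sketched as follows. The class and attribute names are hypothetical; the only behavior taken from the description above is that no seek reaches the first screen until the scrub completes.

```python
class SecondScreenScrubber:
    """Sketch of second-screen scrubbing that leaves first-screen
    playback untouched until the viewer releases the scrubber."""

    def __init__(self, duration: float):
        self.duration = duration
        self.preview_position = None  # position shown only on the second screen
        self.committed_seek = None    # seek actually sent to the first screen

    def drag(self, position: float):
        # While dragging, only the local preview updates; playback on
        # the first screen proceeds normally. Positions are clamped to
        # the media item's duration.
        self.preview_position = max(0.0, min(position, self.duration))

    def release(self) -> float:
        # On release, a single seek message is committed for the
        # first screen device.
        self.committed_seek = self.preview_position
        self.preview_position = None
        return self.committed_seek
```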
In one or more implementations, the dynamic control system 102 modifies or disables components of the playback control graphical user interface 404 based on the context of a media item. For example, as shown in
In at least one implementation, the dynamic control system 102 supports various language options associated with media items. For example, as shown in
Moreover, as shown in
In some examples, the dynamic control system 102 causes the second screen device 124 to continuously display the playback control graphical user interface 404 while a current media item is playing on the first screen device 114. In additional examples, the dynamic control system 102 allows the second screen device 124 to lock after a predetermined amount of time during which no activity is detected via the playback control graphical user interface 404. In that example, the dynamic control system 102 further provides additional information via a lock screen of the second screen device 124.
To illustrate,
In additional implementations, the dynamic control system 102 adds additional or different media item information on the lock screen 602 of the second screen device 124. In one example, as shown in
As mentioned above, the dynamic control system 102 enables the second screen device 124 to act as a controller in connection with a media item playing on the first screen device 114 by first pairing the second screen device 124 and the first screen device 114.
At step 708, the server(s) 112 receives an additional subscription request from the second screen device 124. As with the subscription request from the first screen device 114, this additional subscription request includes identification information that is unique to the second screen device 124. At step 710, the server(s) 112 further transmits the additional subscription request to the dynamic control system 102, where the dynamic control system 102 adds the second screen device 124 to the device subscription list at step 712.
With the addition of a new device to the device subscription list, the dynamic control system 102 determines that the device subscription list has changed at step 714. In response to this determination, the dynamic control system 102 transmits the updated subscription list to one or more devices indicated by the same list. Thus, at step 716, the dynamic control system 102 transmits the updated device subscription list to the server(s) 112, which forwards the updated device subscription list to at least the first screen device 114 at step 718.
In some implementations, devices may also periodically request the device subscription list. For example, at step 720, the second screen device 124 submits a request for the device subscription list to the server(s) 112. The server(s) 112 requests and receives the updated device subscription list from the dynamic control system 102 at steps 722 and 724. The server(s) 112 can then transmit the updated device subscription list to the second screen device 124 at step 726—thereby pairing the first screen device 114 and the second screen device 124. In at least one implementation, the dynamic control system 102 pairs the first screen device 114 and the second screen device 124 only when both devices are signed into the same digital content system account.
Moreover, in some implementations, the dynamic control system 102 includes and utilizes additional information in the device subscription list. To illustrate, in one example, the dynamic control system 102 tracks information identifying a specific profile within the digital content system account into which a screen device is signed. In that example, the digital content system account includes multiple profiles (e.g., for family members within a household). As such, when the second screen device 124 determines which devices to include for pairing (e.g., within the pairing option list user interface 304 as shown in
In at least one implementation, the first screen device 114 and the second screen device 124 are paired once both devices have their identifying information on the device subscription list. At this point, as shown in
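The subscription-list pairing flow described above, including the profile matching from the preceding paragraph, can be sketched with a minimal broker. Device identifiers, profile identifiers, and method names are illustrative assumptions; the real system also involves the server(s) 112 relaying each request.

```python
class PairingBroker:
    """Minimal sketch of device-subscription-list pairing.

    Each device registers with identification information and the
    profile it is signed into; pairing options are limited to devices
    on the same profile, per the description above.
    """

    def __init__(self):
        self.subscriptions = {}  # device_id -> profile_id

    def subscribe(self, device_id: str, profile_id: str) -> list[str]:
        """Register a device and return the updated subscription list."""
        self.subscriptions[device_id] = profile_id
        return list(self.subscriptions)

    def pairing_options(self, device_id: str) -> list[str]:
        """Devices signed into the same profile, excluding the requester."""
        profile = self.subscriptions[device_id]
        return [
            d for d, p in self.subscriptions.items()
            if p == profile and d != device_id
        ]
```

In this sketch, a phone signed into the same profile as a television would see that television in its pairing option list, while a device signed into a different profile would see an empty list.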
As mentioned above, and as shown in
In certain implementations, the dynamic control system 102 represents one or more software applications, modules, or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks. For example, and as will be described in greater detail below, one or more of the pairing manager 802, the communication manager 804, and the modification manager 806 may represent software stored and configured to run on one or more computing devices, such as the server(s) 112, the first screen device 114 and/or the second screen device 124. One or more of the pairing manager 802, the communication manager 804, and the modification manager 806 of the dynamic control system 102 shown in
As mentioned above, and as shown in
As mentioned above, and as shown in
As mentioned above, and as shown in
As shown in
Additionally, the first screen device 114 and the second screen device 124 include the memories 116a and 116b while the server(s) 112 include the memory 106. In one or more implementations, the memories 116a, 116b, and 106 generally represent any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer readable instructions. In one example, the memories 116a, 116b, and 106 may store, load, and/or maintain one or more of the components of the dynamic control system 102. Examples of the memories 116a, 116b, and 106 can include, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, and/or any other suitable storage memory.
Moreover, as shown in
In summary, the dynamic control system 102 enables viewers to simultaneously interact with media items via two or more display devices. For example, as discussed above, the dynamic control system 102 pairs display devices under the same digital content system account—even if those devices are not on the same network. Once the display devices are paired, the dynamic control system 102 causes modifications to one or more graphical user interfaces based on a context of a media item. In this way, the dynamic control system 102 dynamically provides user interface components that are most relevant to a specific media item and what is currently happening in connection with that media item. As such, the dynamic control system 102 provides viewers with a more engaging experience while browsing, selecting, and watching media items via the digital content system 104.
The following will provide, with reference to
Distribution infrastructure 910 generally represents any services, hardware, software, or other infrastructure components configured to deliver content to end users. For example, distribution infrastructure 910 includes content aggregation systems, media transcoding and packaging services, network components, and/or a variety of other types of hardware and software. In some cases, distribution infrastructure 910 is implemented as a highly complex distribution system, a single media server or device, or anything in between. In some examples, regardless of size or complexity, distribution infrastructure 910 includes at least one physical processor 912 and memory 914. One or more modules 916 are stored or loaded into memory 914 to enable adaptive streaming, as discussed herein.
Content player 920 generally represents any type or form of device or system capable of playing audio and/or video content that has been provided over distribution infrastructure 910. Examples of content player 920 include, without limitation, mobile phones, tablets, laptop computers, desktop computers, televisions, set-top boxes, digital media players, virtual reality headsets, augmented reality glasses, and/or any other type or form of device capable of rendering digital content. As with distribution infrastructure 910, content player 920 includes a physical processor 922, memory 924, and one or more modules 926. Some or all of the adaptive streaming processes described herein are performed or enabled by modules 926, and in some examples, modules 916 of distribution infrastructure 910 coordinate with modules 926 of content player 920 to provide adaptive streaming of digital content.
In certain embodiments, one or more of modules 916 and/or 926 in
In addition, one or more of the modules, processes, algorithms, or steps described herein transform data, physical devices, and/or representations of physical devices from one form to another. For example, one or more of the modules recited herein receive audio data to be encoded, transform the audio data by encoding it, output a result of the encoding for use in an adaptive audio bit-rate system, transmit the result of the transformation to a content player, and render the transformed data to an end user for consumption. Additionally or alternatively, one or more of the modules recited herein transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form to another by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
Physical processors 912 and 922 generally represent any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer readable instructions. In one example, physical processors 912 and 922 access and/or modify one or more of modules 916 and 926, respectively. Additionally or alternatively, physical processors 912 and 922 execute one or more of modules 916 and 926 to facilitate adaptive streaming of digital content. Examples of physical processors 912 and 922 include, without limitation, microprocessors, microcontrollers, central processing units (CPUs), field-programmable gate arrays (FPGAs) that implement softcore processors, application-specific integrated circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, and/or any other suitable physical processor.
Memory 914 and 924 generally represent any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, memory 914 and/or 924 stores, loads, and/or maintains one or more of modules 916 and 926. Examples of memory 914 and/or 924 include, without limitation, random access memory (RAM), read only memory (ROM), flash memory, hard disk drives (HDDs), solid-state drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, and/or any other suitable memory device or system.
As shown, storage 1010 may store a variety of different items including content 1012, user data 1014, and/or log data 1016. Content 1012 includes television shows, movies, video games, user-generated content, and/or any other suitable type or form of content. User data 1014 includes personally identifiable information (PII), payment information, preference settings, language and accessibility settings, and/or any other information associated with a particular user or content player. Log data 1016 includes viewing history information, network throughput information, and/or any other metrics associated with a user's connection to or interactions with distribution infrastructure 910.
Services 1020 include personalization services 1022, transcoding services 1024, and/or packaging services 1026. Personalization services 1022 personalize recommendations, content streams, and/or other aspects of a user's experience with distribution infrastructure 910. Transcoding services 1024 compress media at different bitrates, which, as described in greater detail below, enables real-time switching between different encodings. Packaging services 1026 package encoded video before deploying it to a delivery network, such as network 1030, for streaming.
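The multiple-bitrate output of a transcoding service such as transcoding services 1024 can be illustrated with a minimal sketch. The ladder values, class names, and resolution rungs below are assumptions chosen for illustration and are not part of this disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Encoding:
    """One transcoded rendition of a media item (hypothetical schema)."""
    resolution: str
    bitrate_kbps: int

def build_bitrate_ladder(source_height: int) -> list:
    """Return renditions at or below the source resolution, one per rung."""
    ladder = [
        Encoding("1920x1080", 5800),
        Encoding("1280x720", 3000),
        Encoding("960x540", 1750),
        Encoding("640x360", 750),
    ]
    # Never upscale: keep only rungs whose height fits the source.
    return [e for e in ladder if int(e.resolution.split("x")[1]) <= source_height]

# A 720p source yields only the 720p-and-below rungs.
renditions = build_bitrate_ladder(720)
```

Producing several renditions up front is what allows the player to switch encodings mid-stream without re-encoding on demand.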
Network 1030 generally represents any medium or architecture capable of facilitating communication or data transfer. Network 1030 facilitates communication or data transfer using wireless and/or wired connections. Examples of network 1030 include, without limitation, an intranet, a wide area network (WAN), a local area network (LAN), a personal area network (PAN), the Internet, power line communications (PLC), a cellular network (e.g., a global system for mobile communications (GSM) network), portions of one or more of the same, variations or combinations of one or more of the same, and/or any other suitable network. For example, as shown in
As shown in
Communication infrastructure 1102 generally represents any type or form of infrastructure capable of facilitating communication between one or more components of a computing device. Examples of communication infrastructure 1102 include, without limitation, any type or form of communication bus (e.g., a peripheral component interconnect (PCI) bus, PCI Express (PCIe) bus, a memory bus, a frontside bus, an integrated drive electronics (IDE) bus, a control or register bus, a host bus, etc.).
As noted, memory 924 generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or other computer-readable instructions. In some examples, memory 924 stores and/or loads an operating system 1108 for execution by processor 922. In one example, operating system 1108 includes and/or represents software that manages computer hardware and software resources and/or provides common services to computer programs and/or applications on content player 920.
Operating system 1108 performs various system management functions, such as managing hardware components (e.g., graphics interface 1126, audio interface 1130, input interface 1134, and/or storage interface 1138). Operating system 1108 also provides process and memory management models for playback application 1110. The modules of playback application 1110 include, for example, a content buffer 1112, an audio decoder 1118, and a video decoder 1120.
Playback application 1110 is configured to retrieve digital content via communication interface 1122 and play the digital content through graphics interface 1126 and audio interface 1130. Graphics interface 1126 is configured to transmit a rendered video signal to graphics device 1128. Audio interface 1130 is configured to transmit a rendered audio signal to audio device 1132. In normal operation, playback application 1110 receives a request from a user to play a specific title or specific content. Playback application 1110 then identifies one or more encoded video and audio streams associated with the requested title.
In one embodiment, playback application 1110 begins downloading the content associated with the requested title by downloading sequence data encoded to the lowest audio and/or video playback bitrates to minimize startup time for playback. The requested digital content file is then downloaded into content buffer 1112, which is configured to serve as a first-in, first-out queue. In one embodiment, each unit of downloaded data includes a unit of video data or a unit of audio data. As units of video data associated with the requested digital content file are downloaded to the content player 920, the units of video data are pushed into the content buffer 1112. Similarly, as units of audio data associated with the requested digital content file are downloaded to the content player 920, the units of audio data are pushed into the content buffer 1112. In one embodiment, the units of video data are stored in video buffer 1116 within content buffer 1112 and the units of audio data are stored in audio buffer 1114 of content buffer 1112.
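The first-in, first-out buffering described above can be sketched as follows. This is a toy model loosely patterned on content buffer 1112 with its video buffer 1116 and audio buffer 1114; the unit format and method names are assumptions for illustration only.

```python
from collections import deque

class ContentBuffer:
    """FIFO queues for downloaded media units, in the spirit of
    content buffer 1112 (video buffer 1116, audio buffer 1114)."""

    def __init__(self) -> None:
        self.video_buffer = deque()  # plays the role of video buffer 1116
        self.audio_buffer = deque()  # plays the role of audio buffer 1114

    def push(self, unit: dict) -> None:
        """Route a downloaded unit into the matching queue."""
        if unit["kind"] == "video":
            self.video_buffer.append(unit)
        else:
            self.audio_buffer.append(unit)

    def pop_video(self) -> dict:
        """Reading a unit effectively de-queues it, as the decoder does."""
        return self.video_buffer.popleft()

buf = ContentBuffer()
buf.push({"kind": "video", "seq": 0})
buf.push({"kind": "audio", "seq": 0})
buf.push({"kind": "video", "seq": 1})
first = buf.pop_video()  # units come back in download order
```

Because each queue preserves download order, the decoders downstream can simply read from the head of the appropriate buffer.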
A video decoder 1120 reads units of video data from video buffer 1116 and outputs the units of video data in a sequence of video frames corresponding in duration to the fixed span of playback time. Reading a unit of video data from video buffer 1116 effectively de-queues the unit of video data from video buffer 1116. The sequence of video frames is then rendered by graphics interface 1126 and transmitted to graphics device 1128 to be displayed to a user.
An audio decoder 1118 reads units of audio data from audio buffer 1114 and outputs the units of audio data as a sequence of audio samples, generally synchronized in time with a sequence of decoded video frames. In one embodiment, the sequence of audio samples is transmitted to audio interface 1130, which converts the sequence of audio samples into an electrical audio signal. The electrical audio signal is then transmitted to a speaker of audio device 1132, which, in response, generates an acoustic output.
In situations where the bandwidth of distribution infrastructure 910 is limited and/or variable, playback application 1110 downloads and buffers consecutive portions of video data and/or audio data from video encodings with different bit rates based on a variety of factors (e.g., scene complexity, audio complexity, network bandwidth, device capabilities, etc.). In some embodiments, video playback quality is prioritized over audio playback quality. Audio playback and video playback quality are also balanced with each other, and in some embodiments audio playback quality is prioritized over video playback quality.
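A minimal sketch of the bitrate-switching decision described above might look like the following. The safety factor, parameter names, and selection rule are assumptions for illustration, not a definitive statement of how playback application 1110 chooses an encoding.

```python
def select_bitrate(available_kbps: list, measured_bandwidth_kbps: float,
                   safety_factor: float = 0.8) -> int:
    """Pick the highest encoding that fits within a conservative share of
    the measured bandwidth; fall back to the lowest rung otherwise."""
    budget = measured_bandwidth_kbps * safety_factor
    candidates = [b for b in available_kbps if b <= budget]
    return max(candidates) if candidates else min(available_kbps)

# With ~4 Mbps measured, an 80% budget (3200 kbps) admits the 3000 kbps rung.
choice = select_bitrate([750, 1750, 3000, 5800], measured_bandwidth_kbps=4000)

# When bandwidth collapses, the player degrades to the lowest rung.
fallback = select_bitrate([750, 1750, 3000, 5800], measured_bandwidth_kbps=500)
```

In practice such a decision would also weigh scene complexity, buffer occupancy, and device capabilities, as the passage above notes.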
Graphics interface 1126 is configured to generate frames of video data and transmit the frames of video data to graphics device 1128. In one embodiment, graphics interface 1126 is included as part of an integrated circuit, along with processor 922. Alternatively, graphics interface 1126 is configured as a hardware accelerator that is distinct from (i.e., is not integrated within) a chipset that includes processor 922.
Graphics interface 1126 generally represents any type or form of device configured to forward images for display on graphics device 1128. For example, graphics device 1128 is fabricated using liquid crystal display (LCD) technology, cathode-ray technology, or light emitting diode (LED) display technology (either organic or inorganic). In some embodiments, graphics device 1128 also includes a virtual reality display and/or an augmented reality display. Graphics device 1128 includes any technically feasible means for generating an image for display. In other words, graphics device 1128 generally represents any type or form of device capable of visually displaying information forwarded by graphics interface 1126.
As illustrated in
Content player 920 also includes a storage device 1140 coupled to communication infrastructure 1102 via a storage interface 1138. Storage device 1140 generally represents any type or form of storage device or medium capable of storing data and/or other computer readable instructions. For example, storage device 1140 is a magnetic disk drive, a solid-state drive, an optical disk drive, a flash drive, or the like. Storage interface 1138 generally represents any type or form of interface or device for transferring data between storage device 1140 and other components of content player 920.
Example 1: A computer-implemented method for dynamically modifying components of a playback control graphical user interface based on a context of a media item. For example, the method may include pairing a first screen device displaying a media item and a second screen device acting as a controller relative to the media item, determining a context associated with the media item, generating, based on the context, instructions to modify one or more components of a playback control graphical user interface displayed by the second screen device, and transmitting the instructions to the second screen device to trigger modification of the one or more components of the playback control graphical user interface.
Example 2: The computer-implemented method of Example 1, wherein pairing the first screen device displaying the media item and the second screen device acting as the controller includes receiving a device subscription request from the first screen device, generating a device subscription list including the first screen device, receiving a device subscription request from the second screen device, adding the second screen device to the device subscription list, transmitting the device subscription list to the first screen device, and, in response to receiving a request from the first screen device to pair with the second screen device, associating the first screen device and the second screen device.
Example 3: The computer-implemented method of any of Examples 1 and 2, further including, in response to associating the first screen device and the second screen device, receiving one or more messages from the first screen device, and pushing the one or more messages to the second screen device.
Example 4: The computer-implemented method of any of Examples 1-3, wherein pairing the first screen device and the second screen device is in response to a detected selection of the first screen device from a pairing option list displayed by the second screen device.
Example 5: The computer-implemented method of any of Examples 1-4, further including, in response to pairing the first screen device and the second screen device, causing the second screen device to display an indication of successful pairing within the playback control graphical user interface displayed by the second screen device, and causing the first screen device to display the indication of successful pairing within a media item graphical user interface displayed by the first screen device.
Example 6: The computer-implemented method of any of Examples 1-5, further including generating instructions to display the playback control graphical user interface on the second screen device, wherein the instructions to display the playback control graphical user interface are based on whether the media item is a standalone media item or an episodic media item.
Example 7: The computer-implemented method of any of Examples 1-6, wherein determining the context associated with the media item includes at least one of determining that playback of the media item has been initiated, determining that playback of the media item has entered an introduction portion of the media item, determining that the media item is a last episode in a collection of episodes, or determining that a subtitle selection associated with the media item has been made.
Example 8: The computer-implemented method of any of Examples 1-7, further including detecting one or more user selections of components within the playback control graphical user interface on the second screen device, and communicating the detected one or more selections to the first screen device.
Example 9: The computer-implemented method of any of Examples 1-8, wherein generating instructions to modify one or more components of the playback control graphical user interface includes one or more of generating instructions to provide a new component within the playback control graphical user interface, generating instructions to highlight an existing component within the playback control graphical user interface, generating instructions to modify an existing component within the playback control graphical user interface, or generating instructions to update a position of a playback scrubber within the playback control graphical user interface.
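The subscription-list pairing flow of Examples 1-4 can be sketched as a toy server-side service. The class, method names, and in-memory storage below are hypothetical illustrations, not the claimed implementation.

```python
class PairingService:
    """Toy sketch of the device subscription and pairing flow."""

    def __init__(self) -> None:
        self.subscriptions = []  # the device subscription list
        self.pairs = {}          # first screen device -> second screen device

    def subscribe(self, device_id: str) -> list:
        """Handle a device subscription request; return the current list."""
        if device_id not in self.subscriptions:
            self.subscriptions.append(device_id)
        return list(self.subscriptions)

    def pair(self, first_screen: str, second_screen: str) -> bool:
        """Associate the two devices once both appear on the list."""
        if first_screen in self.subscriptions and second_screen in self.subscriptions:
            self.pairs[first_screen] = second_screen
            return True
        return False

svc = PairingService()
svc.subscribe("tv-1")        # first screen device subscribes
svc.subscribe("phone-1")     # second screen device subscribes
paired = svc.pair("tv-1", "phone-1")
```

Once the association exists, messages from the first screen device can be pushed to the paired second screen device, as in Example 3.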
In some examples, a system may include at least one processor and a physical memory including computer-executable instructions that, when executed by the at least one processor, cause the at least one processor to perform various acts. For example, the computer-executable instructions may cause the at least one processor to perform acts including pairing a first screen device displaying a media item and a second screen device acting as a controller relative to the media item, determining a context associated with the media item, generating, based on the context, instructions to modify one or more components of a playback control graphical user interface displayed by the second screen device, and transmitting the instructions to the second screen device to trigger modification of the one or more components of the playback control graphical user interface.
Additionally, in some examples, a method may include receiving, at a second screen device via a digital content system application, instructions to generate a control graphical user interface associated with a media item displayed on a first screen device, transmitting, in response to a detected selection of a pause/play button within the control graphical user interface, an electronic communication that initiates playback of the media item, receiving, at the second screen device via the digital content system application, instructions that trigger one or more modifications of the control graphical user interface based on playback of the media item entering an introduction portion of the media item, and modifying the control graphical user interface based on the received instructions. In one example, the method further includes receiving, at the second screen device via the digital content system application, additional instructions that trigger additional modifications of the control graphical user interface based on playback of the media item moving past the introduction portion of the media item, and modifying the control graphical user interface based on the received additional instructions.
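The context-driven modification of the control graphical user interface can be sketched as a small instruction applier running on the second screen device. The instruction schema ('add', 'highlight', 'update') and the component names are assumptions for illustration only.

```python
def apply_instructions(ui: dict, instructions: list) -> dict:
    """Apply server-sent modification instructions to a playback control
    GUI state (hypothetical schema)."""
    ui = dict(ui)  # do not mutate the caller's state
    for ins in instructions:
        if ins["op"] == "add":
            ui[ins["component"]] = ins["value"]          # new component
        elif ins["op"] == "highlight":
            ui[ins["component"] + "_highlighted"] = True  # emphasize existing
        elif ins["op"] == "update":
            ui[ins["component"]] = ins["value"]          # e.g., scrubber position
    return ui

# Playback enters the introduction portion: surface a skip-intro control
# and keep the scrubber position current.
ui = {"pause_play": "pause", "scrubber": 0}
ui = apply_instructions(ui, [
    {"op": "add", "component": "skip_intro", "value": True},
    {"op": "update", "component": "scrubber", "value": 42},
])
```

When playback moves past the introduction portion, a second batch of instructions could remove or replace the skip-intro control in the same manner.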
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”