The present disclosure relates generally to media presentation systems and, more particularly, to user interfaces to present shared media.
Advancements in communication technology have led to enhanced media players (e.g., personal computers, digital video recorders, home media centers, game playing systems, etc.) and content delivery systems (e.g., broadband, satellite, digital cable, Internet, etc.). For example, every improvement in processing capability allows developers to provide additional functionality to a system. Such advancement also enables a single device or system to integrate control over several functions or operations that were previously performed by multiple devices or systems. The user interfaces that accompany these systems are also evolving.
Although the example apparatus and methods described herein include, among other components, software executed on hardware, such apparatus and methods are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of the disclosed hardware and software components could be embodied exclusively in dedicated hardware, exclusively in software, exclusively in firmware or in some combination of hardware, firmware, and/or software.
Media may take many different forms such as audio, video, and/or photos or images. Media may also include one or more combinations of one or more types of media. For example, media may include one or more images or photos presented jointly with audio content. Another example may include video presented with audio (e.g., audio corresponding to the video content or separate audio played over the video content). In other words, media may include any form of audio and/or visual presentation including, for example, programming or programming content (e.g., a television program or broadcast). The example methods and apparatus described herein may be used to present media that may, for example, be stored on a media storage device in a media presentation system such as, for example, a home entertainment system including a media signal decoder (e.g., a set-top-box, a receiver, etc.) and a television, an audio system, or other media presentation device (e.g., a computer monitor and/or computer speakers). Moreover, the example interfaces described herein may be implemented to facilitate an interaction between such a media presentation system and a peripheral media storage device (e.g., a memory of a networked computer within a home) to present the contents of the peripheral device via the media presentation system (e.g., a television coupled to a set-top box).
The example methods, apparatus, and interfaces described herein to present media may be implemented in connection with any type of media transmission system including, for example, satellite broadcast systems, cable broadcast systems, radio frequency wave broadcast systems, broadband transmission systems, etc. By way of illustration, an example broadcast system is described below in connection with
As illustrated in
In further detail, the example transmission station 102 of the example system of
To facilitate the broadcast of information such as media, the encoded bitstream passes from the encoder 116 to an uplink frequency converter 118, which, using any of a variety of techniques, modulates a carrier wave with the encoded bitstream and converts the modulated carrier to a frequency band suitable for reception by the satellite/relay 104. The modulated, encoded bitstream is then routed from the uplink frequency converter 118 to the uplink antenna 120, where it is broadcast toward the satellite/relay 104.
The satellite/relay 104 receives the modulated, encoded Ku-band bitstream and re-broadcasts it downward toward an area on earth that includes the receiver station 106. In the illustrated example of
In operation of the receiver station 106, the reception antenna 126 receives signals including a bitstream from the satellite/relay 104. The signals are coupled from the reception antenna 126 to the LNB 128, which amplifies and, optionally, downconverts the received signals. The LNB output is then provided to the IRD 130.
The receiver station 106 may also incorporate a connection 136 (e.g., an Ethernet circuit or modem for communicating over the Internet) to the network 122 for transmitting requests for information, media, and/or other data to and from the transmission station 102 (or a device managing the transmission station 102 and the overall flow of data in the example system 100) and for communicating with websites 124 to obtain information therefrom. For example, as discussed further below, the IRD 130 may acquire and decode on-demand content and/or information associated with on-demand content from the on-demand source 115 via the connection 136 (e.g., a broadband Internet connection). Further, the IRD 130 may be coupled to an external media storage device 132 (e.g., a hard drive of a personal computer located in a home along with the IRD 130, or a computer connected over a network or other communication means). As described below, media files (e.g., music or images) stored on the media storage device 132 may be shared with the IRD 130 and presented on a display device (e.g., the display device 220 of
The programming sources 108 receive video and/or audio programming (e.g., various forms of media) from a number of sources, including satellites, terrestrial fiber optics, cable, or tape. The programming may include, but is not limited to, television programming, movies, sporting events, news, music or any other desirable content. Like the programming sources 108, the control data source 110 passes control data to the encoder 116. Control data may include data representative of a list of SCIDs to be used during the encoding process, or any other suitable information.
The data service source 112 receives data service information and web pages made up of text files, graphics, audio, video, software, etc. Such information may be provided via a network 122. In practice, the network 122 may be the Internet, a local area network (LAN), a wide area network (WAN) or a conventional public switched telephone network (PSTN). The information received from various sources is compiled by the data service source 112 and provided to the encoder 116. For example, the data service source 112 may request and receive information from one or more websites 124. The information from the websites 124 may be related to the program information provided to the encoder 116 by the program sources 108, thereby providing additional data related to programming content that may be displayed to a user at the receiver station 106.
The program guide data source 114 compiles information related to the SCIDs used by the encoder 116 to encode the data that is broadcast. For example, the program guide data source 114 includes information that the receiver stations 106 use to generate and display a program guide to a user, wherein the program guide may be configured as a grid that informs the user of particular programs that are available on particular channels at particular times. Such a program guide may also include information that the receiver stations 106 use to assemble programming for display to the user. For example, if the user desires to watch media such as a baseball game on his or her receiver station 106, the user will tune to a channel on which the game is offered. The receiver station 106 then gathers the SCIDs related to the game from a list, previously provided by the program guide data source 114, of SCIDs that correspond to the game. Such a program guide may be manipulated via an input device (e.g., a remote control). For example, a cursor may be moved to highlight a program description within the guide. A user may then select a highlighted program description via the input device to navigate to associated content (e.g., an information screen containing a summary of a television program).
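The guide grid described above can be sketched as a lookup keyed by channel and time slot, where each entry carries the SCIDs the receiver gathers when the user tunes to that program. This is a minimal illustrative sketch; the `ProgramGuide` class, its method names, and the example channel and SCID values are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class GuideEntry:
    title: str
    channel: int
    start: int                                   # start time, e.g. minutes since midnight
    scids: list = field(default_factory=list)    # SCIDs carrying this program

class ProgramGuide:
    """Grid of programs keyed by (channel, start time)."""
    def __init__(self):
        self._grid = {}

    def add(self, entry: GuideEntry):
        self._grid[(entry.channel, entry.start)] = entry

    def scids_for(self, channel: int, time: int) -> list:
        """SCIDs the receiver gathers when the user tunes to `channel` at `time`."""
        entry = self._grid.get((channel, time))
        return entry.scids if entry else []

# Hypothetical entry: a baseball game on channel 206 at 7:00 PM (1140 min)
guide = ProgramGuide()
guide.add(GuideEntry("Baseball Game", channel=206, start=1140, scids=[0x101, 0x102]))
```

A receiver tuning to channel 206 at that time would call `guide.scids_for(206, 1140)` to obtain the list of SCIDs to decode.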
The on-demand (OD) source 115 receives data representative of content or media from a plurality of sources, including, for example, television broadcasting networks, cable networks, system administrators (e.g., providers of the DTH system 100), or other content distributors. Such content (e.g., media) may include television programs, sporting events, movies, music, and corresponding information (e.g., user interface information for OD content) for each program or event. The content may be stored (e.g., on a server) at the transmission station 102 or locally (e.g., at a receiver station 106), and may be updated to include, for example, new episodes of television programs, recently released movies, and/or current advertisements for such content. Via a user interface, which also may be updated periodically to reflect current time or offerings, a user (e.g., a person with a subscription to an OD service) may request (i.e., demand) programming from the OD source 115. The system 100 may then stream the requested content to the user (e.g., over a broadband Internet connection) or make it available for download and storage. Thus, an OD service allows a user to view, download, and/or record selected programming at any time. While the acquisition of such content may involve a delay, the term ‘on-demand’ generally refers to a service that allows a user to request and subsequently receive media content. In other words, while on-demand content may not be immediately available, it includes content that may be requested for transmission (e.g., over a broadband Internet connection or via a satellite), download, and/or storage.
As illustrated in
To communicate with any of a variety of clients, media players, media storage devices, etc., the example IRD 130 includes one or more digital interfaces 230 (e.g., USB, serial port, Firewire, etc.). To communicatively couple the example IRD 130 to, for example, the Internet and/or a home network, the example IRD 130 includes a network interface 235 that implements, for example, an Ethernet interface.
The example IRD 130 is only one example implementation of a device that may be used to carry out the functionality described herein. Similar systems may include additional or alternative components (e.g., decoders, encoders, converters, graphics accelerators, etc.).
Having described the architecture of one example system that may be used to implement a user interface to present shared media, an example process for performing the same is described below. Although the following discloses an example process through the use of a flow diagram having blocks, it should be noted that the process may be implemented in any suitable manner. For example, the processes may be implemented using, among other components, software and/or firmware executed on hardware. However, this is merely one example, and it is contemplated that any form of logic may be used to implement the systems or subsystems disclosed herein. Logic may include, for example, implementations that are made exclusively in dedicated hardware (e.g., circuits, transistors, logic gates, hard-coded processors, programmable array logic (PAL), application-specific integrated circuits (ASICs), etc.), exclusively in software, exclusively in firmware, or in some combination of hardware, firmware, and/or software. For example, instructions representing some or all of the blocks shown in the flow diagrams may be stored in one or more memories or other machine-readable media, such as hard drives or the like. Such instructions may be hard coded or may be alterable. Additionally, some portions of the process may be carried out manually. Furthermore, while each of the processes described herein is shown in a particular order, such an ordering is merely one example and numerous other orders exist.
As described above, a user interface may be provided to facilitate an interaction between a user and a media presentation system. For example, to allow the utilization or navigation of the content stored on a media storage device (e.g., the media storage device 132 of
As indicated by the segments (i.e., the ‘Music,’ ‘Photos,’ ‘Videos,’ and ‘My Computers’ categories) of the menu 304, the example user interface 300 allows a user to navigate through and access media such as music, image, and/or video content from, for example, one or more computers. Other example user interfaces may include additional options to access further types of media. As described below, categories or other values may be selected from the menu 304 to facilitate navigation through the menu 304 and to alter the contents of the menu 304 itself and/or the contents of the staging section 312.
The information section 306 may include information, questions, and/or instructions regarding the use of the media file presentation system. For example, the information section 306 may prompt a user to select a song from a list (as illustrated in
The display section 310 may include, for example, a display of the channel to which the system is currently tuned, or may include a recording currently being played back, a music file currently being played, a slideshow of images currently being presented, and/or a playback of a video from a peripheral media storage device. Additionally or alternatively, the display section 310 may include media information such as information related to a currently tuned channel or programming content. The display section 310 allows a user to continue to view and/or listen to media while navigating through the user interface 300. For example, if a user is viewing a live television broadcast, the display section 310 may display the broadcast while a user parses through a list of songs stored on a media storage device (e.g., the media storage device 132 of
The staging section 312 may be responsive to user selections made in the menu 304 and, as illustrated in the following figures, provides a display of available content from a peripheral (i.e., in relation to the media presentation system) media storage device (e.g., a computer coupled to a set-top box). The staging section 312 may include a textual or graphical representation of the contents of such a media storage device. A cursor may be maneuvered via an input device (e.g., an infrared or radio frequency (RF) remote control) over the contents of the staging section 312 to select a media file for presentation or to view information regarding the file. A selection of a file (e.g., via a textual or pictorial graphic associated with the file) may cause the media presentation system to exit the user interface 300 and return to a main display (e.g., a full-screen presentation mode), where the selected media file may be presented (e.g., a music file may be played or an image or video may be displayed).
The process 400 may determine which, if any, option from the menu 304 was selected. A selection of a ‘Music’ option 314 (block 406), for example, may cause the process 400 to display a set of music categories in the menu 304 (block 408).
Further, a ‘Shuffle All’ option 504 may be included in the menu 304. The selection of the ‘Shuffle All’ option 504 (block 410) may prompt the playing of a randomly chosen song or a continuous string of randomly chosen songs (block 412). On the other hand, when a category is chosen (block 414), the contents of the category are displayed in the staging section 312 (block 416).
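The menu dispatch performed by blocks 406-416 can be sketched as a small handler that maps a user selection to the next interface action. The function name, the return-tuple convention, and the example category names are illustrative assumptions; they are not taken from the disclosure.

```python
import random

MUSIC_CATEGORIES = ["Albums", "Artists", "Songs"]

def handle_music_menu(selection, library, rng=random):
    """Dispatch a menu selection, loosely mirroring blocks 406-416 of process 400."""
    if selection == "Music":
        # Block 408: replace the menu contents with the music categories.
        return ("menu", MUSIC_CATEGORIES)
    if selection == "Shuffle All":
        # Block 412: play a randomly chosen song.
        return ("play", rng.choice(library["Songs"]))
    if selection in MUSIC_CATEGORIES:
        # Block 416: display the category's contents in the staging section.
        return ("staging", library[selection])
    return ("noop", None)

# Hypothetical shared-media library on the peripheral storage device
library = {"Albums": ["Album A"], "Artists": ["Artist X"],
           "Songs": ["Song 1", "Song 2"]}
```

For example, `handle_music_menu("Music", library)` yields the category menu, while selecting a category yields its contents for the staging section.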
Additionally, each entry of the list 602 may have multiple layers or subcategories into which the music data may be organized or sorted. The layers or subcategories may be displayed in a staggered arrangement (i.e., to reflect the organization of the content on the media storage device) upon the selection of an entry of the list 602. For example,
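The staggered arrangement of layers and subcategories can be sketched as a recursive rendering of a nested category tree, with indentation reflecting each layer's depth. The function, the two-space indent, and the example artist/album names are illustrative assumptions only.

```python
def render_staggered(node, depth=0):
    """Return indented text lines reflecting nested category layers."""
    lines = []
    for name, child in node.items():
        lines.append("  " * depth + name)
        if isinstance(child, dict):
            # A subcategory: recurse one layer deeper, staggered further right.
            lines.extend(render_staggered(child, depth + 1))
        else:
            # A leaf list of media files (e.g., songs).
            lines.extend("  " * (depth + 1) + leaf for leaf in child)
    return lines

# Hypothetical organization of music content on the media storage device
tree = {"Artists": {"Artist X": {"Album A": ["Song 1", "Song 2"]}}}
```

Calling `render_staggered(tree)` produces each layer offset under its parent, mirroring how the organization of the content on the media storage device might be displayed.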
Returning to the flowchart of
A selection of a ‘Photos’ option 316 (block 422) from the main page 302 may cause the process 400 to display a set of options in the menu 304 (block 424). FIG. 9 shows a screenshot 900 of the user interface 300 when a user has selected a ‘Photos’ option 316 from the main page 302. Specifically, in this example, the menu 304 includes a ‘Shuffle All’ option 902 and a ‘Browse’ option 904. When the ‘Shuffle All’ option 902 is selected (block 426) a random photo or a string of continuous photos (i.e., a slideshow) from a media storage device (e.g., the media storage device 132 of
On the other hand, when the ‘Browse’ option 904 is selected (block 430), one or more photograph categories may be listed in the menu 304 and/or one or more photographs (or graphics linked to the photographs) may be displayed in the staging section 312 (block 432).
A selection of a ‘My Computers’ option 318 (block 442) from the main page 302 may cause the process 400 to display a list (not shown) of available media sources (block 444) in the menu 304 or the staging section 312. When a media source is selected, the process 400 may access the selected source (e.g., to prepare the user interface 300 with the contents of the selected media source) (block 446). The process 400 may then return the user interface 300 to the main page 302 (block 404).
A selection of a ‘Videos’ option 318 (block 448) from the main page 302 may cause the process 400 to display a set of options in the menu 304 (block 450).
On the other hand, when the ‘Browse’ option 1104 is selected (block 456), one or more video categories may be listed in the menu 304 and/or one or more images (i.e., graphics linked to the video) may be displayed in the staging section 312 (block 458).
The example process 400 described above is one possible implementation of the example user interface 300. The process 400 and the user interface 300 may include additional and/or alternative features or aspects to facilitate an interaction between a user and a media presentation system to present shared media. Further, while the example user interface 300 and the example process 400 include features to present media such as music, video, and images (e.g., mp3 files, digital images, etc.), other types of media may be included in other example user interfaces and/or processes.
Additionally, the example user interfaces described herein may facilitate a paired media feature. A paired media feature may allow a media presentation system to present, for example, a slideshow selected from a media storage device (e.g., the media storage device 132 of
While multiple types of media (e.g., music, video, and/or images) are being presented, the example user interfaces and methods described herein may allow a user to control each type of media with a single user interface and/or a single set of control keys (e.g., buttons on a remote input device). For example, an input device may include a single set of playback keys (e.g., play, pause, fast-forward, etc.) that may be used for any type of media via a control switching button on the input device. Accordingly, when music and images, for example, are being presented simultaneously, the set of input keys may be set to control the images (e.g., a slideshow that may be paused, fast-forwarded, reversed, etc.), and an engagement of the control switching button may cause the same set of input keys to be set to control the music (e.g., a song that may be paused, fast-forwarded, reversed, etc.). Further, as described above, the same user interface (e.g., the user interface 300 of
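The control-switching behavior described above can be sketched as a small router that directs one set of playback keys to whichever media stream is currently active, with the switching button cycling the target. The class, its method names, and the stream labels are illustrative assumptions rather than the disclosed implementation.

```python
class PlaybackControls:
    """One set of playback keys routed to the currently active media stream."""
    def __init__(self, streams):
        self.streams = streams      # e.g. ["images", "music"] presented simultaneously
        self.active = 0             # index of the stream the keys currently control

    def press(self, key):
        """A playback key (play, pause, fast-forward, etc.) acts on the active stream."""
        return (self.streams[self.active], key)

    def switch(self):
        """The control switching button: route the same keys to the next stream."""
        self.active = (self.active + 1) % len(self.streams)

controls = PlaybackControls(["images", "music"])
```

Here `controls.press("pause")` pauses the slideshow; after `controls.switch()`, the identical key press pauses the music instead.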
The user interface 300 may also implement a ‘Now Playing’ feature that allows a user to return to the context (i.e., the state of the user interface 300) from which the currently playing media was chosen. For example, where a user had selected a song from the list 802 of
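The ‘Now Playing’ feature amounts to saving the interface state when media is launched and restoring it on demand. The sketch below assumes a simple saved-context approach; the class name, method names, and the dictionary describing the saved screen state are hypothetical.

```python
class MediaSession:
    """Track the interface context from which the current media was chosen."""
    def __init__(self):
        self.current = None
        self.saved_context = None

    def play_from(self, media, context):
        """Exit to full-screen playback, remembering the originating screen state."""
        self.current = media
        self.saved_context = context
        return f"now playing: {media}"

    def now_playing_key(self):
        """'Now Playing': return the saved context (e.g., the song list and cursor)."""
        return self.saved_context

session = MediaSession()
session.play_from("Song 1", {"screen": "song list", "highlighted": "Song 1"})
```

After playback begins, `session.now_playing_key()` returns the user to the list, with the previously highlighted entry, from which the song was selected.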
The processor 1302 may be coupled to an interface, such as a bus 1310 to which other components may be interfaced. The example RAM 1306 may be implemented by dynamic random access memory (DRAM), Synchronous DRAM (SDRAM), and/or any other type of RAM device, and the example ROM 1308 may be implemented by flash memory and/or any other desired type of memory device. Access to the example memories 1308 and 1306 may be controlled by a memory controller (not shown) in a conventional manner.
To send and/or receive system inputs and/or outputs, the example processor unit 1300 includes any variety of conventional interface circuitry such as, for example, an external bus interface 1312. For example, the external bus interface 1312 may provide one input signal path (e.g., a semiconductor package pin) for each system input. Additionally or alternatively, the external bus interface 1312 may implement any variety of time multiplexed interface to receive output signals via fewer input signals.
To allow the example processor unit 1300 to interact with a remote server, the example processor unit 1300 may include any variety of network interfaces 1318 such as, for example, an Ethernet card, a wireless network card, a modem, or any other network interface suitable to connect the processor unit 1300 to a network. The network to which the processor unit 1300 is connected may be, for example, a local area network (LAN), a wide area network (WAN), the Internet, or any other network. For example, the network could be a home network, an intranet located in a place of business, a closed network linking various locations of a business, or the Internet.
Although an example processor unit 1300 has been illustrated in
The apparatus and methods described above are non-limiting examples.
Although certain example methods and apparatus have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods and apparatus fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents.