The present disclosure relates generally to media presentation systems and, more particularly, to visual indicators associated with a media presentation system.
Media presentation systems often include a user interface to assist a user in utilizing the various services (e.g., an on-demand service) and/or content (e.g., television programming or music channels) of a media delivery system (e.g., a cable or satellite delivery system). Such a user interface may be implemented via on-screen graphics (e.g., menus, lists, etc.) that may be sorted through or manipulated. During utilization of the user interface or the media presentation system in general, the user may engage a button or select an option in response to which the system informs the user that an action cannot be taken or that the requested action was successfully performed. In such cases, the media presentation system may produce a sound indicating that the action was performed, cannot be performed, or is unavailable when the system or user interface is in a certain condition or state. For example, when the end of a menu is reached and a cursor can no longer be scrolled, the media presentation system may produce an audible ‘bonk’ or ‘beep,’ thereby alerting the user that the desired action is unavailable.
The example methods and apparatus to display visual indicators associated with a media presentation system (e.g., a home entertainment system including a media signal decoder and a television) described herein may be implemented in connection with any type of media broadcasting system including, for example, satellite broadcast systems, cable broadcast systems, radio frequency wave broadcast systems, etc. By way of illustration, an example broadcast system is described below in connection with
As illustrated in
In further detail, the example transmission station 102 of the example system of
To facilitate the broadcast of information, the encoded bitstream passes from the encoder 116 to an uplink frequency converter 118, which, using any of a variety of techniques, modulates a carrier wave with the encoded bitstream and converts it to a frequency band suitable for reception by the satellite/relay 104. The modulated, encoded bitstream is then routed from the uplink frequency converter 118 to an uplink antenna 120, where it is broadcast toward the satellite/relay 104.
The programming sources 108 receive video and audio programming from a number of sources, including, for example, satellites, terrestrial fiber optics, cable, and tape. The video and audio programming may include, but is not limited to, television programming, movies, sporting events, news, music, or any other desirable content.
Like the programming sources 108, the control data source 110 passes control data to the encoder 116. Control data may include data representative of a list of SCIDs to be used during the encoding process, or any other suitable information.
The data service source 112 receives data service information and web pages made up of text files, graphics, audio, video, software, etc. Such information may be provided via a network 122. In practice, the network 122 may be the Internet, a local area network (LAN), a wide area network (WAN), or a conventional public switched telephone network (PSTN). The information received from various sources is compiled by the data service source 112 and provided to the encoder 116. For example, the data service source 112 may request and receive information from one or more websites 124. The information from the websites 124 may be related to the program information provided to the encoder 116 by the programming sources 108, thereby providing additional data related to programming content that may be displayed to a user at the receiver station 106.
The program guide data source 114 compiles information related to the SCIDs used by the encoder 116 to encode the data that is broadcast. For example, the program guide data source 114 includes information that the receiver stations 106 use to generate and display a program guide to a user, wherein the program guide may be a grid guide that informs the user of particular programs that are available on particular channels at particular times. The program guide also includes information that the receiver stations 106 use to assemble programming for display to the user. For example, if the user desires to watch a baseball game on his or her receiver station 106, the user will tune to a channel on which the game is offered. The receiver station 106 gathers the SCIDs related to the game, wherein the program guide data source 114 has previously provided to the receiver station 106 a list of SCIDs that correspond to the game. Such a program guide may be manipulated via an input device (e.g., a remote control). For example, a cursor may be moved to highlight a program description within the guide. A user may then select a highlighted program description via the input device to navigate to associated content (e.g., an information screen containing a summary of a television show episode) or activate an interactive feature (e.g., a program information screen, a recording process, a future showing list, etc.) associated with an entry of the program guide.
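By way of illustration and not limitation, the following sketch (written in Python, with hypothetical names such as GuideEntry and scids_for_channel that do not correspond to any element of the disclosed system) shows one way the list of SCIDs previously provided by the program guide data source 114 might be looked up when the user tunes to a channel:

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class GuideEntry:
        channel: int        # channel on which the program is offered
        title: str          # program description shown in the grid guide
        scids: List[int]    # SCIDs previously supplied by the program guide data source

    def scids_for_channel(guide: Dict[int, GuideEntry], channel: int) -> List[int]:
        # The receiver station gathers the SCIDs related to the tuned channel so that
        # the corresponding data can be assembled into programming for display.
        entry = guide.get(channel)
        return entry.scids if entry is not None else []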
The on-demand (OD) source 115 receives data from a plurality of sources, including, for example, television broadcasting networks, cable networks, system administrators (e.g., providers of the DTH system 100), or other content distributors. Such content may include television programs, sporting events, movies, music, and corresponding information (e.g., user interface information for OD content) for each program or event. The content may be stored (e.g., on a server) at the transmission station 102 or locally (e.g., at a receiver station 106), and may be updated to include, for example, new episodes of television programs, recently released movies, and/or current advertisements for such content. Via a user interface, which also may be updated periodically, a user (e.g., a person with a subscription to an OD service) may request (i.e., demand) programming from the OD source 115. The system 100 may then stream the requested content to the user (e.g., over the satellite/relay 104 or the network 122) or make it available for download and storage (discussed further below in connection with
The satellite/relay 104 receives the modulated, encoded Ku-band bitstream and re-broadcasts it downward toward an area on earth that includes the receiver station 106. In the illustrated example of
The receiver station 106 may also incorporate a connection 136 (e.g., Ethernet circuit or modem for communicating over the Internet) to the network 122 for transmitting requests and other data back to the transmission station 102 (or a device managing the transmission station 102 and overall flow of data in the example system 100) and for communicating with websites 124 to obtain information therefrom.
In operation of the receiver station 106, the reception antenna 126 receives signals including a bitstream from the satellite/relay 104. The signals are coupled from the reception antenna 126 to the LNB 128, which amplifies and, optionally, downconverts the received signals. The LNB output is then provided to the IRD 130.
As illustrated in
To communicate with any of a variety of clients, media players, etc., the example IRD 130 includes one or more digital interfaces 230 (e.g., USB, serial port, Firewire, etc.). To communicatively couple the example IRD 130 to, for instance, the Internet and/or a home network, the example IRD 130 includes a network interface 235 that implements, for example, an Ethernet interface.
Further, the example IRD 130 includes an example event controller 240 to monitor and/or respond to events occurring in, for example, a user interface (e.g., a plurality of interacting on-screen menus, lists, queues, etc. to be manipulated via a remote control) or the system 100 in general. Specifically, the example event controller 240 monitors events associated with an aspect of the media presentation system (e.g., features of the user interface), determines what response or what type of response the events invoke, and, in some examples, causes the media presentation system to audibly and/or visually inform a user of a condition (e.g., successful action taken or action unavailable) or a state of an element (e.g., the user interface) of the media presentation system. Events may be internal (e.g., requests to store a program in memory that may or may not have sufficient free space) or external (e.g., failed transfers of data between the transmission station 102 and the receiver station 106) to the IRD 130 and may include a selection of a feature or option (e.g., a scheduling of a recording, scrolling through a menu, selecting a channel for tuning, etc.), activation or deactivation of the user interface, changing of modes, a request for information, a systematic error, a completion of a download, a system confirmation or notification, etc. Further, the characteristics, aspects, or general operation of the responses (e.g., visual indicators) to the events, as generated by the event controller 240, may be dependent on an operational state or settings of the IRD 130, the display device 220, or other component of the media presentation system.
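For purposes of illustration only, the general behavior of such an event controller may be sketched as follows (Python; the event names, settings keys, and methods are hypothetical assumptions and are not limiting):

    # Illustrative sketch only; event names and settings keys are hypothetical.
    class EventController:
        RESPONSES = {
            "scroll_past_end": "action unavailable",    # e.g., a menu cannot scroll further
            "recording_scheduled": "action taken",      # e.g., a recording was scheduled
            "download_complete": "system notification",
        }

        def __init__(self, settings):
            self.settings = settings  # operational state/settings of the IRD or display device

        def handle(self, event):
            # Determine whether the event invokes a response and, if so, what type.
            condition = self.RESPONSES.get(event)
            if condition is None:
                return
            # Inform the user audibly and/or visually of the condition.
            if self.settings.get("visualization") or not self.settings.get("audio_available", True):
                self.show_visual_indicator(condition)
            else:
                self.play_audible_indicator(condition)

        def show_visual_indicator(self, condition):
            print(f"[visual] {condition}")

        def play_audible_indicator(self, condition):
            print(f"[audible] {condition}")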
As described below in connection with
Although the following discloses example processes through the use of flow diagrams having blocks, it should be noted that these processes may be implemented in any suitable manner. For example, the processes may be implemented using, among other components, software or firmware executed on hardware. However, this is merely one example and it is contemplated that any form of logic may be used to implement the systems or subsystems disclosed herein. Logic may include, for example, implementations that are made exclusively in dedicated hardware (e.g., circuits, transistors, logic gates, hard-coded processors, programmable array logic (PAL), application-specific integrated circuits (ASICs), etc.), exclusively in software, exclusively in firmware, or some combination of hardware, firmware, and/or software. For example, instructions representing some or all of the blocks shown in the flow diagrams may be stored in one or more memories or other machine readable media, such as hard drives or the like (e.g., the memories 806 and/or 808 of
The process 300 starts with an activation of a media presentation system (e.g., the system 100 of
A module or device (e.g., the event controller 240 of
The engagement of these options may cause an action in the user interface (e.g., a scrolling or jumping) and a corresponding notification (e.g., an audio or visual effect) when the requested action is available or can be accommodated. In other examples, where the action requested by the event cannot be accommodated and/or is unavailable in the current state or condition of the media presentation system, the user may be notified (audibly and/or visually) of the unavailability of the requested option or the inability of the system to perform the requested action. In other words, depending on a condition (e.g., a current position of a cursor in a menu, a mode of the user interface, recorded content being played back, the presence of new messages in a mailbox, transitioning of states, experiencing a system error, etc.) of the media presentation system or associated user interface, the action requested by the event may or may not be performed. Further, the audience or user may be notified (e.g., via audible and/or visual indicators) as to whether the requested action may be performed.
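By way of a non-limiting sketch (hypothetical Python names), a single condition-dependent request, such as scrolling past the end of a menu, might be handled as follows, with the returned condition then driving the audible and/or visual notification described above:

    # Sketch only: a scroll request at the end of a menu cannot be accommodated
    # and therefore invokes an indicator. The menu representation is hypothetical.
    def request_scroll_down(menu):
        if menu["cursor"] >= len(menu["items"]) - 1:
            return "action unavailable"    # end of menu reached; cursor cannot be scrolled
        menu["cursor"] += 1
        return "action taken"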
In the example process 300 of
If the event does not invoke an indicator, the system may perform the action requested by the event and the process 300 may continue to monitor the system for events (block 304). Alternatively, if the detected event does invoke an indicator, the process 300 may then determine whether a visual indicator should be displayed. The visualization feature may be activated manually by the user (e.g., via a setting in the user interface) (block 310). For example, the visualization feature may be manually activated when one or more users are hearing-impaired and, thus, unable to hear (i.e., audibly receive) an audible indication of a condition of the media presentation system (e.g., a switching of modes from live broadcast viewing to an on-demand mode). Such a hearing-impaired user may manually set the visualization feature, causing the visualization feature to remain active despite any powering off of the media presentation system or a component thereof. In some examples, the visualization feature may be activated in a loud place (e.g., a restaurant or bar) that may include one or more displays (e.g., television sets). In some examples, the user may manually activate the visualization feature due to a preference for the visual indicators, regardless of any inability to hear an audio indicator. Where the process 300 determines that such a manual setting is active, one or more visual indicators (e.g., the visual indicators 402, 502, 602, and 702 of
The process 300 may also determine whether an audible indicator can be audibly received by one or more users (block 314). If the audio indicators cannot be audibly received, the associated visual indicators are displayed (block 312). For example, the system may be muted for any of a variety of reasons and thus unable to generate an audible effect (e.g., a ‘bonk’ sound to indicate that a hard disk is full). In some examples, a closed-captioning function is activated when the mute function is active or may be activated independent of the mute function. In some examples, the process 300 may determine whether the volume level is set below a threshold value (e.g., a preset value or a setting that may be adjusted by a user), thereby restricting the ability of the user to audibly receive an audio indicator. Interacting with a system in any of these situations (or any other situation in which the user is unable to receive an indication produced by the system) may prove difficult without any useful indicators to inform the user of, for example, an unavailability or inability to perform a requested action. These and other problems may be alleviated by the visual indicators described herein.
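One non-limiting way to express the determinations of blocks 310 and 314 is sketched below (Python). The setting names (visualization, muted, closed_captioning, volume, volume_threshold) are assumptions introduced for illustration only, as is the treatment of closed captioning as a hint that audio may not be received:

    # Sketch only: display a visual indicator when the visualization feature is manually
    # set (block 310) or when an audible indicator cannot be audibly received (block 314).
    def should_display_visual_indicator(settings: dict) -> bool:
        if settings.get("visualization"):                       # block 310: manual user setting
            return True
        # Block 314: can an audible indicator be audibly received?
        if settings.get("muted") or settings.get("closed_captioning"):
            return True                                         # e.g., the system is muted
        # A volume level below a (possibly user-adjustable) threshold also restricts the
        # user's ability to audibly receive an audio indicator.
        return settings.get("volume", 0) < settings.get("volume_threshold", 1)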
The example visual indicator 402 of
Other visual indicators may include alternative texts or graphics (e.g., various colors, words, shapes, sizes, etc.) depending on, for example, the basis on which the visual indicator is displayed. For example, while the visual indicator 402 may be displayed when a menu cannot scroll as requested, the example visual indicator 502 of
The example visual indicator 602 of
In some examples, a visual indicator may be assigned a shape and/or location similar to an existing element of the user interface. For example, as shown in
Further, the visual indicators may include such varying characteristics (e.g., color, shape, etc.) based on the condition of the media presentation system and/or user interface that causes the example visual indicators to be displayed. For example, the visual indicators may be yellow when the user interface is changing screens (e.g., transitioning from a program guide to an information screen), red when a requested action is unavailable (e.g., a menu cannot be scrolled down any further), green when an action is taken successfully, etc. Additionally and/or alternatively, the characteristics of the visual indicators may depend on the currently displayed content (e.g., a currently tuned channel or playback of recorded content). For example, the provider of the media presentation system may assign different characteristics to the visual indicators based on the type of content (e.g., comedy, drama, sports, etc.) or channel being viewed by the user. In some examples, such characteristics may include a graphic associated with a genre (e.g., a football graphic for sports) or a content provider (e.g., a logo of a broadcast channel). Similarly, the contents of the embedded text may depend on the condition of the media presentation system and/or the user interface.
Additionally and/or alternatively, the visual indicators may be presented for variable, static, repeating, or dynamic durations. The duration of display may depend on, for example, a repetition of a request, a default setting, or the condition of the system and/or the user interface. Further, similar to the other aspects of the visual indicators described herein, the duration of display may be customizable via a setting in the user interface.
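By way of illustration only, the condition-dependent characteristics and display durations described above might be represented as a simple style table (Python). The particular colors echo the examples above, while the duration values and the indicator_duration_s setting are assumptions:

    # Sketch only: condition-dependent characteristics and durations of the visual indicators.
    INDICATOR_STYLES = {
        "screen_transition":  {"color": "yellow", "duration_s": 1.0},
        "action_unavailable": {"color": "red",    "duration_s": 1.5},
        "action_taken":       {"color": "green",  "duration_s": 0.75},
    }

    def style_for(condition: str, user_settings: dict) -> dict:
        style = dict(INDICATOR_STYLES.get(condition, {"color": "white", "duration_s": 1.0}))
        # The duration of display may be customized via a setting in the user interface.
        style["duration_s"] = user_settings.get("indicator_duration_s", style["duration_s"])
        return style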
The methods and apparatus described herein may be designed by, for example, a content delivery system (DIRECTV®) programmer and/or a content provider (e.g., a broadcasting company). Where the visual indicators (e.g., graphics containing text corresponding to an audio indicator) are designed and/or added to elements or portions of a program guide by a content provider (e.g., the National Broadcasting Company), the content delivery system programmer may make adjustments to tailor the visual indicators to comply with system parameters (e.g., size or shape of a graphic).
The processor 802 may be coupled to an interface, such as a bus 810, to which other components may be interfaced. The example RAM 806 may be implemented by dynamic random access memory (DRAM), Synchronous DRAM (SDRAM), and/or any other type of RAM device, and the example ROM 808 may be implemented by flash memory and/or any other desired type of memory device. Access to the example memories 806 and 808 may be controlled by a memory controller (not shown) in a conventional manner.
To send and/or receive system inputs and/or outputs, the example processor unit 800 includes any variety of conventional interface circuitry such as, for example, an external bus interface 812. For example, the external bus interface 812 may provide one input signal path (e.g., a semiconductor package pin) for each system input. Additionally or alternatively, the external bus interface 812 may implement any of a variety of time multiplexed interfaces to carry multiple input and/or output signals via fewer signal paths.
To allow the example processor unit 800 to interact with a remote server, the example processor unit 800 may include any variety of network interfaces 818 such as, for example, an Ethernet card, a wireless network card, a modem, or any other network interface suitable to connect the processor unit 800 to a network. The network to which the processor unit 800 is connected may be, for example, a local area network (LAN), a wide area network (WAN), the Internet, or any other network. For example, the network could be a home network, an intranet located in a place of business, a closed network linking various locations of a business, or the Internet.
Although an example processor unit 800 has been illustrated in
The apparatus and methods described above are non-limiting examples. Although the example apparatus and methods described herein include, among other components, software executed on hardware, such apparatus and methods are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of the disclosed hardware and software components could be embodied exclusively in dedicated hardware, exclusively in software, exclusively in firmware or in some combination of hardware, firmware, and/or software.
Further, although certain example methods and apparatus have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods and apparatus fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents.