The present invention generally relates to user interfaces in television receiver devices, and more particularly relates to systems and methods for providing customer service features in a television receiver device.
Most television viewers now receive their television signals through a content aggregator such as a cable or satellite television provider. For subscribers to a direct broadcast satellite (DBS) service, for example, television programming is received via a broadcast sent from a satellite to an antenna that is generally located on the exterior of a home or other structure. Other customers receive television programming through conventional television broadcasts, or through cable, wireless or other media. Programming is typically received at a receiver such as a “set top box” (STB) or other receiver that demodulates the received signals and converts the demodulated content into a format that can be presented to the viewer on a television or other display. In addition to receiving and demodulating television programming, many television receivers are able to provide additional features. Examples of features available in many modern television receivers include electronic program guides (EPGs), digital or other personal video recorders, “place-shifting” features for streaming received content over a network or other medium, customer service information features, and/or the like.
While some set-top box type receivers have provided limited customer service features, in general, most viewers still prefer to contact a customer service center via telephone rather than use the box itself for even routine billing or service queries. This is at least partly due to limitations in the conventional interfaces provided by most television receiver devices.
While conventional interfaces are useful for many purposes, there remains a continual desire for more efficient and intuitive user interfaces to the various features provided by the receiver. In particular, there is a desire to provide convenient access to customer service features using more advanced interface features. It is therefore desirable to create systems and methods for improving the viewer interface to customer service features associated with the television receiver. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.
According to various exemplary embodiments, systems and methods are provided for graphically providing customer service features with a set-top box (STB) or other video receiver.
In various embodiments, a method is provided for graphically providing customer service features on the television receiver in response to viewer instructions received from a remote control. Imagery including a customer service tile is presented on the display. A two-dimensional input is received from the remote control that indicates that the tile has been selected, and, in response to the received two-dimensional input, the customer service feature is provided. The customer service feature may provide information received from a remote source via a network, programming connection or other medium.
Other embodiments provide a method for providing a customer service feature in a television receiver configured to present imagery on a display in response to viewer instructions transmitted from a remote control. The method comprises storing customer service information received from a remote source at the television receiver, presenting the imagery on the display, wherein the imagery comprises a tile corresponding to the customer service feature, receiving a two-dimensional input from the remote control that indicates that the tile has been selected by the viewer, and, in response to the received two-dimensional input, providing the stored customer service information on the display.
Still other embodiments provide a video receiver for presenting imagery on a display in response to viewer input signals provided from a remote control. The receiver comprises a receiver interface configured to receive an incoming modulated signal, a decoder configured to decode the incoming modulated signal to extract a video signal, a wireless receiver configured to receive the viewer input signals from the remote control, wherein the viewer input is a two-dimensional input, and a processor. The processor is configured to generate the imagery presented on the display, wherein the imagery comprises a tile corresponding to a customer service feature, and wherein the processor is further configured to receive the viewer input signals from the remote control, and, in response to the received two-dimensional input indicating that the tile has been selected by the viewer, to provide the customer service feature associated with the selected tile.
Various other embodiments, aspects and other features are described in more detail below.
Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements.
The following detailed description of the invention is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
Generally speaking, the viewer is able to access channels, programs, program guide information, placeshifting features, customer service features and/or any other features through a graphical user interface that includes various tiles that can be selected using a two-dimensional input device. In various embodiments, the viewer manipulates a touchpad or other two-dimensional input feature incorporated within a remote control to direct a cursor toward one or more tiles. By selecting the tile, a feature associated with the tile can be accessed. Features that can be associated with tiles include tuning particular channels or programs, setting recordings or other features associated with a PVR or placeshifting feature, accessing further menuing features provided by the receiver, and/or the like. Further, in many embodiments, the features associated with one or more tiles may be individually configured by the viewer and/or a content provider to customize the interface provided to the viewer by the receiver.
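By way of a non-limiting illustration, the tile-to-feature association described above can be pictured as a simple dispatch table. The following Python sketch is illustrative only; the tile names, actions and select_tile helper are hypothetical examples rather than part of any actual receiver implementation.

```python
# Illustrative sketch only: maps each on-screen tile to a configurable action.
# Tile names and actions are hypothetical examples, not part of the patent text.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Tile:
    label: str                   # text shown on the tile
    action: Callable[[], str]    # feature invoked when the tile is selected


def tune_channel(number: int) -> Callable[[], str]:
    """Return an action that tunes a (viewer-configurable) channel."""
    return lambda: f"Tuning channel {number}"


def show_customer_service() -> str:
    return "Displaying customer service information"


# A viewer- or provider-configurable set of tiles, keyed by identifier.
tiles: Dict[str, Tile] = {
    "favorite_channel": Tile("Favorite Channel", tune_channel(7)),
    "customer_service": Tile("Customer Service", show_customer_service),
}


def select_tile(tile_id: str) -> str:
    """Invoke the feature associated with the selected tile."""
    return tiles[tile_id].action()


if __name__ == "__main__":
    print(select_tile("favorite_channel"))   # Tuning channel 7
    print(select_tile("customer_service"))   # Displaying customer service information
```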
Turning now to the drawing figures and with initial reference to
In the exemplary view shown in
Television imagery is presented on display 102 as desired by the viewer. Further, two-dimensional navigation features may be presented to allow the viewer to access various features of receiver 108 through control of a cursor 114 or other interface feature via remote control 112. In various embodiments, cursor 114 is able to move in response to two-dimensional input signals 127, which are, in turn, generated in response to inputs applied to two-dimensional input device 123. By moving cursor 114 to interact with the two-dimensional navigation features presented on display 102, various channels, programs, and/or other features may be tuned, activated or otherwise manipulated as desired.
Receiver 108 is any component, device or logic capable of receiving and decoding video signals 105. In various embodiments, receiver 108 is a set-top box (STB) or the like capable of receiving satellite, cable, broadcast and/or other signals encoding audio/visual content. Receiver 108 may further demodulate or otherwise decode the received signals 105 to extract programming that can be locally viewed on display 102 as desired. Receiver 108 may also include a content database stored on a hard disk drive, memory, or other storage medium to support a digital or other personal video recorder (DVR/PVR) feature in some embodiments. Receiver 108 may also provide place shifting, electronic program guide, multi-stream viewing and/or other features as appropriate.
In the exemplary embodiment illustrated in
Display 102 is any device capable of presenting imagery 110 to a viewer. In various embodiments, display 102 is a conventional television set, such as any sort of television operating in accordance with any digital or analog protocols, standards or other formats. Display 102 may be a conventional NTSC or PAL television receiver, for example. In other embodiments, display 102 is a monitor or other device that may not include built-in receiver functionality, but that is nevertheless capable of presenting imagery in response to signal 107 received from receiver 108. In various embodiments, receiver 108 and display 102 may be physically combined or interconnected in any manner. A receiver card, for example, could be inserted into a slot or other interface in a conventional television, or the functionality of receiver 108 may be provided within a conventional television display 102. In other embodiments, signals 107 are transferred between receiver 108 and display 102 using any sort of cable or other interface (including a wireless interface). Examples of common interfaces include, without limitation, component video, S-video, High-Definition Multimedia Interface (HDMI), Digital Visual Interface (DVI), IEEE 1394, and/or any other formats as desired.
Remote control 112 is any sort of control device capable of providing signals 127 to receiver 108 that represent inputs received from one or more viewers. Typically, remote control 112 is an infrared, radio frequency (RF) or other wireless remote that includes any number of buttons or other features for receiving viewer inputs. In an exemplary embodiment, remote control 112 communicates with receiver 108 using the IEEE 802.15.4 (“ZIGBEE”) protocol, the RF for consumer electronics (e.g., “RF4CE” or “EC Net”) protocols, and/or any other standard or non-standard technique for implementing wireless personal area networks (WPANs). Other embodiments may instead communicate using IEEE 802.15.1 (“BLUETOOTH”), IEEE 802.11 (“WI-FI”), conventional infrared, and/or any other wireless techniques. In some embodiments, remote control 112 may be able to support multiple types of wireless communication, such as WPAN communications and also infrared communications. This feature may be useful when remote control 112 is a so-called universal remote that is able to provide input signals 127 to multiple devices.
Remote control 112 generally includes any sort of buttons, sliders, rocker switches and/or other features for receiving physical inputs from the viewer. As the user depresses or otherwise interacts with the features, remote control 112 suitably produces wireless signals 127 in response. In further embodiments, remote control 112 includes a two-dimensional input device 123 that is able to receive inputs from the user in any multi-dimensional format (e.g., “X,Y”, “r,Θ”, and/or the like). Examples of two-dimensional input devices 123 that could be used in various embodiments include, without limitation, touchpads, directional pads, joysticks, trackballs, sets of arrows or other buttons, and/or the like. In a typical implementation, two-dimensional input device 123 provides coordinates or other signals 127 that indicate absolute (e.g., “X,Y”) and/or relative (e.g., “ΔX,ΔY”) movement in two or more dimensions. Such signals 127 may be decoded at receiver 108 or elsewhere to correlate the viewer's actions with respect to input device 123 with movement of cursor 114 or other features presented on display 102.
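As a rough, non-authoritative sketch of how relative (“ΔX,ΔY”) signals from a touchpad or similar device might be accumulated into an on-screen cursor position, the following Python example clamps the cursor to an assumed display resolution; the class and field names are illustrative assumptions only.

```python
# Illustrative sketch: accumulate relative two-dimensional input events into a
# cursor position clamped to the display resolution. Names are hypothetical.

from dataclasses import dataclass


@dataclass
class Cursor:
    x: int = 0
    y: int = 0
    width: int = 1280    # display width in pixels (assumed)
    height: int = 720    # display height in pixels (assumed)

    def apply_delta(self, dx: int, dy: int) -> None:
        """Move the cursor by a relative (dx, dy) input, staying on screen."""
        self.x = max(0, min(self.width - 1, self.x + dx))
        self.y = max(0, min(self.height - 1, self.y + dy))


cursor = Cursor(x=640, y=360)
for dx, dy in [(25, -10), (400, 0), (1000, 1000)]:   # simulated touchpad deltas
    cursor.apply_delta(dx, dy)
    print(cursor.x, cursor.y)
```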
In the exemplary embodiment shown in
In operation, then, receiver 108 suitably receives television signals 105 from a satellite, cable, broadcast or other source. In a satellite based embodiment, for example, one or more channels can be extracted from a conventional satellite feed; the video content on the selected channel can be demodulated, extracted and otherwise processed as appropriate to display the desired content to the viewer. One or more cable or broadcast channels may be similarly obtained in any manner. In some embodiments, receiver 108 may obtain multiple channel signals from different sources (e.g., one channel from a cable or satellite source and another channel from a terrestrial broadcast, DVD or other source). In still further embodiments, signals 105 may also provide EPG data, signaling information, marketing or promotional content, customer-specific information such as billing or services information, and/or the like.
Receiver 108 suitably obtains the desired content from the channel(s) indicated by the viewer, and presents the content on display 102. In various embodiments, viewers are able to further view imagery (e.g., the imagery 110 shown in
Imagery 110 may be organized and presented in any manner. In the exemplary embodiment shown in
Imagery 110 may also include other features as appropriate. The exemplary imagery 110 shown in
In the exemplary embodiment shown in
The particular tiles 124A-E shown in
In various embodiments, the particular tiles displayed in any tab 126 are configurable so that the viewer and/or a service provider are able to choose particular tiles for presentation on display 102. Tab 126C, for example, may allow a customized set of tiles to be provided for a particular viewer or receiver 104. Tiles may be assigned to the custom view in any manner. Viewers may be able to drag tiles, for example, from the default view (tab 126A) or another view (e.g., the “all tiles” tab 126B) to the custom tile view (e.g., tab 126C). Other embodiments may populate custom tile lists in any other manner. Further, the features performed by certain tiles may be configurable. A “favorite channel” tile, for example, could be configured to tune to a channel that is chosen by the viewer, since this preference would vary from viewer to viewer. Other parameters may be adjusted based on temporal factors, viewer preferences, and/or other factors as appropriate.
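A minimal sketch of such configurable tile lists appears below. The tab identifiers loosely mirror tabs 126A-C, but the data structures, helper function and particular parameters shown are hypothetical examples, not a description of any specific embodiment.

```python
# Illustrative sketch: tab views holding configurable lists of tile identifiers.
# "default", "all" and "custom" roughly mirror tabs 126A-C; everything else here
# (names, helper functions, values) is a hypothetical example.

ALL_TILES = ["network_a", "program_x", "favorite_channel", "customer_service"]

tabs = {
    "default": ["network_a", "customer_service"],  # provider-chosen default view
    "all": list(ALL_TILES),                        # every available tile
    "custom": [],                                  # populated by the viewer
}

# Viewer-adjustable parameters for individual tiles (e.g., which channel the
# "favorite channel" tile tunes when selected).
tile_settings = {"favorite_channel": {"channel": 7}}


def add_to_custom(tile_id: str) -> None:
    """Copy a tile from another view into the viewer's custom tab."""
    if tile_id in ALL_TILES and tile_id not in tabs["custom"]:
        tabs["custom"].append(tile_id)


add_to_custom("favorite_channel")
tile_settings["favorite_channel"]["channel"] = 12   # viewer reconfigures the tile
print(tabs["custom"], tile_settings["favorite_channel"])
```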
In various embodiments, imagery 110 also includes a help feature that is accessible through a tile, icon or other help indicator 134. When the viewer selects indicator 134 (e.g., by directing cursor 114 toward the indicator 134 and then selecting the feature), additional information can be provided to the viewer. Such information may include context-specific instructions for using the particular window(s) on the display, instructions for using one or more tiles 124 or features associated with any tile 124, and/or any other information as desired.
Receiver 108 may be physically and logically implemented in any manner.
Various embodiments of receiver 108 therefore include any number of appropriate modules for obtaining and processing media content as desired for the particular embodiment. Each of these modules may be implemented in any combination of hardware and/or software using logic executed within any number of semiconductor chips or other processing logic.
Various embodiments of control logic 205 can include any circuitry, components, hardware, software and/or firmware logic capable of controlling the various components of receiver 108. Various routines, methods and processes executed within receiver 108 are typically carried out under control of control logic 205, as described more fully below. Generally speaking, control logic 205 receives user input signals 127 (
As noted above, receiver 108 suitably includes a receiver interface 208, which is any hardware, software, firmware and/or other logic capable of receiving media content via one or more content sources 105. In various embodiments, content sources 105 may include cable television, direct broadcast satellite (DBS), broadcast and/or other programming sources as appropriate. Receiver interface 208 appropriately selects a desired input source and provides the received content to an appropriate destination for further processing. In various embodiments, received programming may be provided in real-time (or near real-time) to a transport stream select module 212 or other component for immediate decoding and presentation to the user. Alternatively, receiver interface 208 may provide content received from any source to a disk or other storage medium in embodiments that provide DVR functionality. In such embodiments, receiver 108 may also include a disk controller module 206 that interacts with an internal or external hard disk, memory and/or other device 110 that stores content in a database or other filing system, as desired.
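The routing decision described above, in which received content is either passed to the live decode path or written to storage for DVR use, might be sketched as follows; the function and variable names are placeholders for illustration and are not the actual module interfaces of receiver 108.

```python
# Illustrative sketch: route received content either to the live decode path or
# to DVR storage. Names are placeholders for illustration only.

def route_content(packet: bytes, record: bool, dvr_store: list, live_queue: list) -> None:
    """Send a received transport packet to storage (DVR) or to live decoding."""
    if record:
        dvr_store.append(packet)     # disk controller / storage medium path
    else:
        live_queue.append(packet)    # transport stream select / decoder path


dvr_store: list = []
live_queue: list = []
route_content(b"\x47" + b"\x00" * 187, record=False, dvr_store=dvr_store, live_queue=live_queue)
route_content(b"\x47" + b"\x01" * 187, record=True, dvr_store=dvr_store, live_queue=live_queue)
print(len(live_queue), len(dvr_store))   # 1 1
```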
In the embodiment shown in
Various embodiments of receiver 108 are able to store information 245 on storage medium 110 for later retrieval. Such information 245 may include customer-specific billing or service information, audio/video clips for promotional, educational or other purposes, and/or any other information as desired. This information may be obtained via the receiver interface 208 (e.g., from a satellite, cable or other programming signal 105), via network interface 210, or from any other source as desired. In some embodiments, information 245 may be received in the form of a broadcast message transmitted as part of a satellite or cable signal 105 that includes customer-specific information associated with a particular viewer or a particular receiver 108. This information may be encoded and/or encrypted as desired, and may be indexed, for example, to a unique code associated with receiver 108. In such embodiments receiver 108 is able to extract customer or receiver-specific information from the broadcast message using any appropriate techniques. One technique for obtaining billing, services and/or other customer-specific information via a satellite, cable or other programming connection is described in U.S. patent application Ser. No. 12/197,100 entitled “Systems and Methods for High Bandwidth Delivery of Customer-Specific Information” and filed on Aug. 22, 2008, although other techniques could be used in other embodiments. Other embodiments may obtain information 245 through a back-channel query (e.g., using network interface 210) to a remote server. In other embodiments, information 245 may not be downloaded, but rather may be “burned in” or otherwise stored on receiver 108 before receiver 108 is distributed to the viewer.
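One way to picture the extraction of receiver-specific information from a broadcast message is sketched below. The message format, field names and receiver identifier are hypothetical, and the encoding and encryption described above (and in the referenced application) are not reproduced here.

```python
# Illustrative sketch: pick out customer-specific information addressed to this
# receiver from a stream of broadcast messages, then store it locally. The
# message format and field names are hypothetical; real messages may be
# encrypted and indexed to a unique receiver code as described above.

RECEIVER_ID = "RX-0001"   # unique code associated with this receiver (assumed)
stored_info = {}          # stands in for information 245 on the storage medium


def handle_broadcast(messages):
    """Keep only messages indexed to this receiver's unique code."""
    for msg in messages:
        if msg.get("receiver_id") == RECEIVER_ID:
            stored_info[msg["kind"]] = msg["payload"]


handle_broadcast([
    {"receiver_id": "RX-9999", "kind": "billing", "payload": "..."},
    {"receiver_id": "RX-0001", "kind": "billing", "payload": "Balance due: $42.17"},
])
print(stored_info)   # {'billing': 'Balance due: $42.17'}
```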
Transport stream select module 212 is any hardware and/or software logic capable of selecting a desired media stream from the available sources. In the embodiment shown in
Receiver 108 may include any number of decoder modules 214 for decoding, decompressing and/or otherwise processing received/stored content as desired. Generally speaking, decoder module 214 decompresses, decodes and/or otherwise processes received content from stream select module 212 to extract an MPEG or other media stream encoded within the stream. The decoded content can then be processed by one or more display processor modules 218 to create a presentation on display 102 (
Display processor module 218 includes any appropriate hardware, software and/or other logic to create desired screen displays via display interface 228. Such displays may include combining signals received from one or more decoder modules 214 to facilitate viewing of one or more channels. In various embodiments, display processor module 218 is also able to produce on-screen displays (OSDs) for electronic program guide, setup and control, input/output facilitation and/or other features that may vary from embodiment to embodiment. Such displays are not typically contained within the received or stored broadcast stream, but are nevertheless useful to users in interacting with receiver 108 or the like. The generated displays, including received/stored content and any other displays, may then be presented to one or more output interfaces 228 in any desired format. The various interface features described herein, for example, may be generated by display processor module 218 operating alone or in conjunction with control logic 205.
Display processor 218 may also generate imagery 110 in response to received viewer inputs (and/or in response to instructions from control logic 205) to thereby make up a user interface that allows the viewer to select channels or programs, or to perform other tasks as desired. When the viewer provides inputs at tiles 124 or any other user interface features, for example, display processor 218 may be operable to draw (or redraw) imagery 110 in response, and/or to present television content identified by the viewer, as appropriate. As receiver 108 receives user inputs 127 from remote control 112, control logic 205 may direct display processor 218 to adjust any feature(s) of imagery 110 as directed by the viewer. Display processor 218 therefore directs the presentation of imagery 110 in conjunction with one or more navigation features, and adjusts the imagery 110 in response to inputs received from the viewer.
Display processor 218 produces an output signal encoded in any standard format (e.g., ITU656 format for standard definition television signals or any format for high definition television signals) that can be readily converted to standard and/or high definition television signals at interface 228. In other embodiments, the functionality of display processor 218 and interface 228 may be combined in any manner.
In the embodiment shown in
As noted above, the particular features executed by any particular tile can vary widely from embodiment to embodiment, and even from tile to tile. Selecting a network tile (e.g., tile 124A), for example, may result in any actions associated with that network being executed. A particular channel could be tuned, for example, or other features could be provided as appropriate. Such features may include, for example, specific information about the network or programs produced by the network. Other features could include scheduling or program guide information that is specific to the network. In still other embodiments, the viewer may be able to select and view preview clips, informational clips, or other additional content about programs shown on the associated network. Such information may be downloaded, for example, via a digital network or via satellite, cable and/or other programming signals 105 delivered to receiver 108. In some embodiments, tiles (e.g., network tile 124A) could be sponsored tiles in which a DBS, cable or other television service provider sends instructions to receiver 108 to display a sponsored tile 124A at appropriate times.
Similarly, tiles associated with particular programs (e.g., tile 124H in
Channel tiles (e.g., tiles 124F-G) may be associated with particular channels in any manner. Selection of these tiles may result in the particular channel being immediately tuned by receiver 108 so that programming on that channel can be presented. In other embodiments, program guide information for that channel can be displayed so the viewer can see upcoming programming on that channel. In still other embodiments, particular channels may provide other information (e.g., RSS feeds for weather, traffic, local news and/or other information) that can be displayed in response to selection of the tile. Other features may be alternately or additionally provided as desired.
As noted above, other tiles could be formulated for any purpose or feature. Some tiles may provide viewer settings, for example, that allow the viewer to configure receiver 108 in any manner. Tiles 124B, 124C, 124I shown in
Customer service features may be implemented in any manner. In various embodiments, a customer service tile 124E is provided that may be selected by the viewer as desired. Customer service tile 124E may be provided in any window 125 or other feature that provides convenient access to the viewer. As noted above, a viewer selects tile 124E using cursor 114 and/or remote control 112 as appropriate. In response to the selection of the customer service tile 124E, customer service features may be provided as desired.
Customer service features provided by receiver 108 may vary from embodiment to embodiment. In some embodiments, such features may simply provide information such as assistance windows (e.g., in response to selection of help indicator 134), user guide information, information about configuring or operating receiver 108, and/or the like. Such information may be stored (e.g., as information 245) within receiver 108 as desired. In further embodiments, customer or receiver specific information such as billing information, service information and/or the like may be provided. The imagery 110 shown in
In various further embodiments, an option to “pay now” could be additionally provided. By clicking tile 406, for example, the viewer could be presented with a window or other imagery 110 that allows for entry of bill payment information (e.g., a credit card number) via remote control 112 or the like. Such information may be entered, for example, using a keypad on remote control 112, a “virtual keypad” presented in imagery 110 that interacts with cursor 114, and/or the like. Payment information entered could be provided to a remote service center via network interface 210 or the like. Similarly, various embodiments may provide a tile 410 or other feature that allows the viewer to order additional services (e.g., pay-per-view, on demand or other services; or to change a bundle of ordered services from a DBS, cable or other service provider) using similar techniques. In embodiments where a backchannel is not available or convenient, however, “pay now” or “additional services” functionality could be omitted without affecting the other features provided.
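For illustration only, the conditional availability of a “pay now” feature might be sketched as follows. The function below is a schematic placeholder: it performs no real transport, validation or security handling, and the names used are assumptions rather than any actual payment interface.

```python
# Illustrative sketch: a "pay now" feature that is only offered when a network
# backchannel is available, as suggested above. The payment handling here is a
# placeholder; no real transport, validation, or security is implemented.

def pay_now(card_digits: str, network_available: bool) -> str:
    """Submit bill payment over the backchannel, or report the feature is unavailable."""
    if not network_available:
        return "Pay-now feature unavailable: no backchannel connection"
    masked = "*" * (len(card_digits) - 4) + card_digits[-4:]
    # A real receiver would transmit this securely to a remote service center.
    return f"Payment submitted for card {masked}"


print(pay_now("4111111111111111", network_available=True))
print(pay_now("4111111111111111", network_available=False))
```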
The customer service information presented as part of imagery 110 may be received via receiver interface 208 and/or network interface 210, or may be partially or entirely collected at receiver 108 as services (such as pay-per-view or the like) are requested. Some or all of the information contained in these windows may be extracted from information 245 (
With reference now to
Customer service information is received in any manner. As noted above, information 245 may be stored on receiver 108 during initial configuration in some embodiments. Alternately or additionally, information 245 may be received from a remote source via interface 208 and/or interface 210. In various embodiments, information 245 may be obtained from a satellite or cable broadcast, with customer or receiver-specific information extracted from the broadcast message as desired. In still further embodiments, customer service information 245 may be obtained in real-time (or near real-time) in response to a request by receiver 108; such a request may be posted via network interface 210 or the like. Received information 245 may be stored (e.g., on storage medium 110 or any other medium available to receiver 108) as desired.
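A minimal sketch of this retrieval behavior, assuming hypothetical function names and a simple dictionary in place of storage medium 110, is shown below: information already present locally is returned directly, and information that is not present is requested from a remote source and cached.

```python
# Illustrative sketch: serve customer service information from the local store
# when it is already present (e.g., pre-loaded or broadcast-delivered), and
# otherwise request it from a remote source. Function names are hypothetical.

local_store = {"user_guide": "Press MENU for setup options."}   # stands in for information 245


def fetch_from_remote(key: str) -> str:
    """Stand-in for a real-time request over the network interface."""
    return f"<{key} downloaded from remote server>"


def get_customer_service_info(key: str) -> str:
    if key in local_store:              # stored during configuration or via broadcast
        return local_store[key]
    value = fetch_from_remote(key)      # near real-time backchannel request
    local_store[key] = value            # cache for later retrieval
    return value


print(get_customer_service_info("user_guide"))
print(get_customer_service_info("billing_summary"))
```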
Step 502 suitably involves presenting imagery 110 with one or more tiles 124 (e.g., tiles 124A-I in
The viewer interacts with the presented tiles 124 in any manner. As noted above, in various embodiments the viewer provides inputs to receiver 104 using a remote control 112 that incorporates a two-dimensional input device 123 such as a touchpad, motion sensor, directional pad, joystick, trackball and/or the like. Signals 127 from remote control 112 provide receiver 104 with appropriate information to direct the position of cursor 114 on imagery 110, and to indicate viewer selections of tiles 124 as appropriate.
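As a simple illustration of how a selection input might be resolved against the presented tiles, the sketch below tests the cursor position against assumed tile bounding boxes; the coordinates and tile identifiers are hypothetical examples.

```python
# Illustrative sketch: determine which tile (if any) the cursor is over when a
# selection input arrives. Tile rectangles and names are hypothetical examples.

from typing import Dict, Optional, Tuple

# Tile bounding boxes as (left, top, right, bottom) screen coordinates.
TILE_BOUNDS: Dict[str, Tuple[int, int, int, int]] = {
    "customer_service": (40, 600, 240, 680),
    "favorite_channel": (260, 600, 460, 680),
}


def hit_test(cursor_x: int, cursor_y: int) -> Optional[str]:
    """Return the identifier of the tile under the cursor, or None."""
    for tile_id, (left, top, right, bottom) in TILE_BOUNDS.items():
        if left <= cursor_x <= right and top <= cursor_y <= bottom:
            return tile_id
    return None


print(hit_test(100, 640))   # customer_service
print(hit_test(10, 10))     # None
```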
In response to the viewer selecting customer service tile 124E, receiver 104 suitably provides the feature(s) associated with the selected tile in any manner (step 408). The particular customer service features, as noted above, can vary significantly from embodiment to embodiment and tile to tile. Some features may include tuning to a selected channel or program that provides customer specific information, providing downloaded content (including customer or receiver specific content, as described above), providing an interface for additional options (e.g., obtaining additional information such as billing details, listings of services and/or the like), processing bill payment, and/or taking any other actions as appropriate.
Accordingly, new systems and techniques are presented for graphically providing customer service features in a television receiver or the like using two-dimensional graphical interaction between the viewer and a tile or other feature presented on display 102.
As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations.
While the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing various embodiments of the invention, it should be appreciated that the particular embodiments described above are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of elements described without departing from the scope of the invention.