Systems and methods for graphical control of picture-in-picture windows

Information

  • Patent Grant
  • Patent Number
    9,357,262
  • Date Filed
    Tuesday, September 30, 2008
  • Date Issued
    Tuesday, May 31, 2016
Abstract
Systems and methods are provided for presenting a picture-in-picture (PIP) window on a television or other display generated by a set-top box (STB) or other video receiver. The picture-in-picture window is presented in conjunction with a navigation feature on the display. A two-dimensional input associated with the navigation feature is received from a remote control having a touchpad or other two-dimensional input device. The picture-in-picture window on the display is appropriately adjusted in response to the two-dimensional input.
Description
TECHNICAL FIELD

The present invention generally relates to television viewing, and more particularly relates to systems and methods for providing graphical control of picture-in-picture windows displayed by set-top boxes or other television receivers.


BACKGROUND

Most television viewers now receive their television signals through a content aggregator such as a cable or satellite television provider. For subscribers to a direct broadcast satellite (DBS) service, for example, television programming is received via a broadcast that is sent via a satellite to an antenna that is generally located on the exterior of a home or other structure. Other customers receive television programming through conventional television broadcasts, or through cable, wireless or other media. Programming is typically received at a receiver such as a “set top box” (STB) or other receiver that demodulates the received signals and converts the demodulated content into a format that can be presented to the viewer on a television or other display.


In addition to receiving and demodulating television programming, many television receivers are able to provide additional features. Examples of features available in many modern television receivers include electronic program guides (EPGs), digital or other personal video recorders, “place-shifting” features for streaming received content over a network or other medium, and/or the ability to simultaneously view multiple programs showing on different channels. In the latter case, a “picture-in-picture” (PIP) display is typically provided wherein a relatively small image of a secondary program is superimposed upon a primary display. While television viewers have widely adopted PIP functionality, there nevertheless remains a desire to improve the configurability of PIP features. Moreover, there is a continual desire for more efficient and intuitive user interfaces to the various other features provided by the television receiver, including PIP features.


It is therefore desirable to create systems and methods that improve the viewer interface to the television receiver for features such as picture-in-picture. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.


BRIEF SUMMARY

According to various exemplary embodiments, systems and methods are provided for presenting a picture-in-picture (PIP) window on a television or other display generated by a set-top box (STB) or other video receiver.


In various embodiments, the picture-in-picture window is presented in conjunction with a navigation feature on the display. A two-dimensional input associated with the navigation feature is received from a remote control having a touchpad, directional pad, joystick, trackball, set of directional buttons and/or any other two-dimensional input device. The picture-in-picture window on the display is appropriately moved, resized, reordered or otherwise adjusted in response to the two-dimensional input.


In other embodiments, a video receiver suitably comprises a receiver interface configured to receive an incoming modulated signal and a decoder configured to decode the incoming modulated signal to obtain primary and secondary video signals. The video receiver further comprises a wireless receiver configured to receive a two-dimensional input signal, and a processor configured to generate an output image comprising the primary and secondary video signals in a picture-in-picture window in conjunction with a navigation feature and to adjust the picture-in-picture window on the display when the two-dimensional input signal corresponds to the navigation feature.


Still other embodiments provide a system for presenting television content on a display. A wireless remote control comprises a two-dimensional input device configured to provide a two-dimensional input signal in response to a user input. A video receiver comprises a receiver interface configured to receive an incoming modulated signal and a decoder configured to decode the incoming modulated signal to obtain primary and secondary video signals. The receiver further comprises a wireless receiver configured to receive the two-dimensional input signal from the wireless remote control, and a processor configured to generate an output image to be presented on the display, wherein the output image comprises the secondary video signal superimposed on the primary video signal in a picture-in-picture window in proximity to a plurality of directional indicators, and wherein the processor is further configured to relocate the picture-in-picture window on the display when the two-dimensional input signal corresponds to one of the plurality of directional indicators.


Various other embodiments, aspects and other features are described in more detail below.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and



FIG. 1 is a block diagram of an exemplary television receiver system;



FIG. 2 is a block diagram of an exemplary television receiver device;



FIGS. 3 and 4 are exemplary screen displays with several types of graphical interaction with PIP features; and



FIG. 5 is a flowchart of an exemplary process for presenting a PIP window.





DETAILED DESCRIPTION

The following detailed description of the invention is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.


Generally speaking, control of picture-in-picture (PIP) functionality can be substantially improved by allowing two-dimensional interaction with the PIP window. The PIP window may be moved, resized and/or otherwise modified, for example, by providing any number of arrow buttons or other directional indicators on the screen that can be “clicked” or otherwise actuated in response to two-dimensional inputs received from the viewer. By allowing for two-dimensional interaction with the PIP window, the convenience of the PIP feature is greatly improved while simplifying the viewer's interaction with the display.


Turning now to the drawing figures and with initial reference to FIG. 1, an exemplary system 100 for presenting television signals to a viewer suitably includes a receiver 108 that receives signals 105 in any format and produces appropriate outputs 107 to generate imagery 110 on display 102. Typically, receiver 108 interacts with a wireless remote control 112, which may include any sort of two-dimensional input device 124 for producing two-dimensional input signals 125 in response to viewer inputs.


Television imagery is presented on display 102 as desired by the viewer. In various embodiments, a PIP window 111 may be presented overlying a primary window 110 to allow simultaneous viewing of multiple programs. Further, two-dimensional navigation features (e.g., arrows 115-118) may be presented to allow the viewer to manipulate PIP window 111 through control of a cursor 114 or other interface feature via remote control 112. In various embodiments, cursor 114 is able to move in response to two-dimensional input signals 125, which are, in turn, generated in response to inputs applied to two-dimensional input device 124. By moving cursor 114 to interact with the two-dimensional navigation features presented on display 102, PIP window 111 may be moved, resized, re-aligned or otherwise manipulated as desired.


Receiver 108 is any component, device or logic capable of receiving and decoding video signals 105. In various embodiments, receiver 108 is a set-top box (STB) or the like capable of receiving satellite, cable, broadcast and/or other signals encoding audio/visual content. Receiver 108 may further demodulate or otherwise decode the received signals 105 to extract programming that can be locally viewed on display 102 as desired. Receiver 108 may also include a content database stored on a hard disk drive, memory, or other storage medium to support a digital or other personal video recorder (DVR/PVR) feature as appropriate. Receiver 108 may also provide place shifting, electronic program guide, multi-stream viewing and/or other features as appropriate.


In the exemplary embodiment illustrated in FIG. 1, receiver 108 is shown receiving direct broadcast satellite (DBS) signals 105 from a satellite 106 at an antenna 104. Equivalent embodiments, however, could receive programming 105 from one or more programming sources, including any sort of satellite, cable or broadcast source, as well as any Internet or other network source or the like. In embodiments that include DVR functionality, programming may be stored in any sort of database as desired (e.g., in response to user/viewer programming instructions) for subsequent viewing. Content may also be received from digital versatile disks (DVDs) or other removable media in some embodiments.


Display 102 is any device capable of presenting imagery to a viewer. In various embodiments, display 102 is a conventional television set, such as any sort of television operating in accordance with any digital or analog protocols, standards or other formats. Display 102 may be a conventional NTSC or PAL television receiver, for example. In other embodiments, display 102 is a monitor or other device that may not include built-in receiver functionality, but that is nevertheless capable of presenting imagery in response to signal 107 received from receiver 108. In various embodiments, receiver 108 and display 102 may be physically combined or interconnected in any manner. A receiver card, for example, could be inserted into a slot or other interface in a conventional television, or the functionality of receiver 108 may be provided within a conventional television display 102. In other embodiments, signals 107 are transferred between receiver 108 and display 102 using any sort of cable or other interface (including a wireless interface). Examples of common interfaces include, without limitation, component video, S-video, High-Definition Multimedia Interface (HDMI), Digital Visual Interface (DVI), IEEE 1394, and/or any other formats as desired.


Remote control 112 is any sort of control device capable of providing signals 125 to receiver 108 that represent inputs received from one or more viewers. In various embodiments, remote control 112 is an infrared, radio frequency (RF) or other wireless remote that includes any number of buttons or other features for receiving viewer inputs. In an exemplary embodiment, remote control 112 communicates with receiver 108 using the IEEE 802.15.4 (“ZIGBEE”) protocol for wireless personal area networks (WPANs), although other embodiments may instead communicate using IEEE 802.15.1 (“BLUETOOTH”), IEEE 802.11 (“WI-FI”), conventional infrared, and/or any other wireless techniques.


In further embodiments, remote control 112 includes a two-dimensional input device 124 that is able to receive inputs from the user in any multi-dimensional format (e.g., “X,Y”, “r,Θ”, and/or the like). Examples of two-dimensional input devices 124 that could be used in various embodiments include, without limitation, touchpads, directional pads, joysticks, trackballs, sets of arrows or other buttons, and/or the like. In a typical implementation, two-dimensional input device 124 provides coordinates or other signals 125 that indicate absolute (e.g., “X,Y”) and/or relative (e.g., “ΔX, ΔY”) movement in two or more dimensions. Such signals 125 may be decoded at receiver 108 or elsewhere to translate the viewer's actions on input device 124 into movement of cursor 114 or other features presented on display 102.
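

This translation from device coordinates to cursor movement can be pictured with a brief sketch. The Python fragment below is purely illustrative and is not drawn from the patent; the class name, gain factor and clamping policy are all assumptions. It shows one way that relative (“ΔX, ΔY”) reports from a device such as input device 124 might be accumulated into an absolute cursor position bounded by the display resolution.

    # Illustrative sketch only; names and behavior are assumptions, not the
    # patented implementation.

    class Cursor:
        """Accumulates relative (dx, dy) reports into a clamped screen position."""

        def __init__(self, screen_w, screen_h):
            self.screen_w = screen_w
            self.screen_h = screen_h
            self.x = screen_w // 2   # start the cursor at screen center
            self.y = screen_h // 2

        def apply_delta(self, dx, dy, gain=1.0):
            # Scale raw device counts by a gain factor, then clamp so the
            # cursor never leaves the visible display area.
            self.x = max(0, min(self.screen_w - 1, round(self.x + gain * dx)))
            self.y = max(0, min(self.screen_h - 1, round(self.y + gain * dy)))
            return self.x, self.y

    cursor = Cursor(1920, 1080)
    print(cursor.apply_delta(40, -25))   # -> (1000, 515)

An absolute-coordinate device could skip the accumulation step and simply clamp each incoming (“X,Y”) report directly, which is why both report styles can drive the same cursor.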


In the exemplary embodiment shown in FIG. 1, remote control 112 is illustrated with a touchpad-type device 124 that accepts viewer inputs applied with a finger, stylus or other object. FIG. 1 also shows touchpad device 124 as having dedicated scroll regions 122 and 128 for vertical and horizontal scrolling, respectively. Viewer movements within region 122, which is more-or-less parallel to the right edge of device 124, could result in vertical scrolling, whereas movements within region 128, which is more-or-less parallel to the bottom edge of device 124, could result in horizontal scrolling. Dedicated scrolling regions 122, 128 are optional features, however, that may not be present in all embodiments. Further, scrolling from a touchpad or other device 124 could be implemented in any other manner.
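

As a rough illustration of how contacts on such a touchpad might be classified, the sketch below partitions a normalized pad surface into a vertical scroll strip along the right edge (standing in for region 122), a horizontal strip along the bottom edge (standing in for region 128), and a general pointing area. The strip width and the coordinate convention are assumptions made for the example, not details taken from the patent.

    # Hypothetical classifier for touchpad contacts; thresholds are assumptions.

    SCROLL_STRIP = 0.15  # assumed fraction of the pad reserved for each strip

    def classify_contact(x, y):
        """Classify a normalized contact (0..1, 0..1, origin at top-left).

        Returns 'v-scroll' for the strip along the right edge (region 122),
        'h-scroll' for the strip along the bottom edge (region 128), and
        'pointer' for everything else.
        """
        if x >= 1.0 - SCROLL_STRIP:
            return "v-scroll"       # vertical scrolling, region 122
        if y >= 1.0 - SCROLL_STRIP:
            return "h-scroll"       # horizontal scrolling, region 128
        return "pointer"            # ordinary two-dimensional cursor input

    print(classify_contact(0.95, 0.40))  # v-scroll
    print(classify_contact(0.40, 0.95))  # h-scroll
    print(classify_contact(0.40, 0.40))  # pointer

A contact in the shared corner is treated as vertical scrolling here; a real implementation could resolve that overlap differently.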


In operation, then, receiver 108 suitably receives television signals 105 from a satellite, cable, broadcast or other source. Signals 105 typically encompass multiple channels that can be simultaneously viewed. In a satellite-based embodiment, for example, a primary channel and a secondary channel can be extracted from a common satellite feed. One or more cable or broadcast channels may also be obtained in any manner. In other embodiments, receiver 108 may obtain multiple channel signals from different sources (e.g., one channel from a cable or satellite source and another channel from a terrestrial broadcast, DVD or other source).


Receiver 108 suitably obtains the desired content from the channel(s) indicated by the viewer, and presents the content on display 102. In various embodiments, primary and secondary channels may be presented in a conventional PIP window 111, with the secondary channel superimposed upon (e.g., presented in a smaller window within) the imagery 110 obtained from the primary channel.


The viewer is able to interact with PIP window 111 in any manner. In various embodiments, the viewer is able to move a cursor or similar pointer 114 on display 102 using two-dimensional input device 124. By pointing to various interface features that are presented in association with PIP window 111, the viewer may be able to move, resize or otherwise adjust the window 111 as desired. The exemplary embodiment shown in FIG. 1, for example, shows four directional arrows 115, 116, 117 and 118 (corresponding to movement in left, right, up and down directions, respectively) that can be “clicked” or otherwise indicated to move window 111 in a desired direction. Although not specifically shown in FIG. 1, other icons could be present to allow resizing, reordering, swapping of primary/secondary content, and/or other features as desired. In still other embodiments, window 111 could be presented with a drag bar or other feature that would allow dragging of window 111 to a desired position within the primary imagery 110, as described more fully below.
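

One minimal way to realize the arrow behavior is to hit-test the cursor against each arrow's bounding rectangle when a select event arrives and nudge the window by a fixed step. The sketch below is an illustration only; the rectangle layout, step size and function names are assumptions rather than the claimed implementation.

    # Illustrative sketch; the geometry helpers and step size are assumptions.

    STEP = 40  # pixels the PIP window moves per arrow activation (assumed)

    # Each arrow maps to a movement vector: left, right, up, down
    # (loosely mirroring arrows 115-118 in FIG. 1).
    ARROWS = {
        "left":  (-STEP, 0),
        "right": (STEP, 0),
        "up":    (0, -STEP),
        "down":  (0, STEP),
    }

    def hit(rect, cx, cy):
        """Return True if cursor (cx, cy) falls inside rect = (x, y, w, h)."""
        x, y, w, h = rect
        return x <= cx < x + w and y <= cy < y + h

    def on_select(cursor, arrow_rects, window_pos):
        """On a 'select' press, move the window if the cursor is on an arrow."""
        cx, cy = cursor
        wx, wy = window_pos
        for name, rect in arrow_rects.items():
            if hit(rect, cx, cy):
                dx, dy = ARROWS[name]
                return wx + dx, wy + dy
        return window_pos  # cursor was not over any arrow; nothing changes

    rects = {"right": (700, 300, 32, 32)}
    print(on_select((710, 310), rects, (600, 280)))  # -> (640, 280)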



FIG. 2 provides additional detail about an exemplary receiver 108 that includes a receiver interface 208, a decoder 214 and a display processor 218, as appropriate. FIG. 2 also shows a disk controller interface 206 to a disk or other storage device 110, an interface 210 to a local or wide area network, a transport select module 212, a display interface 228, an RF receiver module and control logic 205. Other embodiments may incorporate additional or alternate processing modules from those shown in FIG. 2, may omit one or more modules shown in FIG. 2, and/or may organize the various modules in any manner different from the exemplary arrangement shown in FIG. 2.


Receiver 108 may be physically and logically implemented in any manner. FIG. 2 shows various logical and functional features that may be present in an exemplary device; each module shown in the figure may be implemented with any sort of hardware, software, firmware and/or the like. Any of the various modules may be implemented with any sort of general or special purpose integrated circuitry, for example, such as any sort of microprocessor, microcontroller, digital signal processor, programmed array and/or the like. Any number of the modules shown in FIG. 2, for example, may be implemented as a “system on a chip” (SoC) using any suitable processing circuitry under control of any appropriate control logic 205. In various embodiments, control logic 205 executes within an integrated SoC or other processor that implements receiver interface 208, transport selector 212, decoder 214, display processor 218, disk controller 206 and/or other features, as appropriate. The Broadcom Corporation of Irvine, Calif., for example, produces several models of processors (e.g., the model BCM 7400 family of processors) that are capable of supporting SoC implementations of satellite and/or cable receiver systems, although products from any number of other suppliers could be equivalently used. In still other embodiments, various distinct chips, circuits or components may be interconnected and interoperate with each other to implement the receiving and decoding functions represented in FIG. 2.


Various embodiments of receiver 108 therefore include any number of appropriate modules for obtaining and processing media content as desired for the particular embodiment. Each of these modules may be implemented in any combination of hardware and/or software using logic executed within any number of semiconductor chips or other processing logic.


Various embodiments of control logic 205 can include any circuitry, components, hardware, software and/or firmware logic capable of controlling the various components of receiver 108. Various routines, methods and processes executed within receiver 108 are typically carried out under control of control logic 205, as described more fully below. Generally speaking, control logic 205 receives user input signals 125 (FIG. 1) via an RF receiver interface 232 that is able to communicate with the remote control 112 using a suitable antenna 234. Control logic 205 receives user inputs from remote control 112 and/or any other source, and directs the other components of receiver 108 in response to the received inputs to present the desired imagery on display 102.


As noted above, many embodiments of receiver 108 include a receiver interface 208, which is any hardware, software, firmware and/or other logic capable of receiving media content via one or more content sources 105. In various embodiments, content sources 105 may include cable television, DBS, broadcast and/or other programming sources as appropriate. Receiver interface 208 appropriately selects a desired input source and provides the received content to an appropriate destination for further processing. In various embodiments, received programming may be provided in real-time (or near real-time) to a transport stream select module 212 or other component for immediate decoding and presentation to the user. Alternatively, receiver interface 208 may provide content received from any source to a disk or other storage medium in embodiments that provide DVR functionality. In such embodiments, receiver 108 may also include a disk controller module 206 that interacts with an internal or external hard disk, memory and/or other device that stores content in a database 110, as described above.


In the embodiment shown in FIG. 2, receiver 108 also includes an appropriate network interface 210, which operates using any implementation of protocols or other features to support communication by receiver 108 on a local or wide area network. In various embodiments, network interface 210 supports conventional LAN, WAN or other protocols (e.g., the TCP/IP or UDP/IP suite of protocols widely used on the Internet) to allow receiver 108 to communicate on the Internet or any other network as desired. Network interface 210 typically interfaces with the network using any sort of LAN adapter hardware, such as a conventional network interface card (NIC) or the like provided within receiver 108.


Transport stream select module 212 is any hardware and/or software logic capable of selecting a desired media stream from the available sources. In the embodiment shown in FIG. 2, stream select module 212 is able to generate video signals for presentation on one or more output interfaces 228. Typically, transport select module 212 responds to viewer inputs (e.g., via control logic 205) to simply switch encoded content received from a broadcast, satellite, cable or other source 105 or from storage 110 to one or more decoder modules 214.


Receiver 108 may include any number of decoder modules 214A-B for decoding, decompressing and/or otherwise processing received/stored content as desired. Generally speaking, decoder modules 214A-B decompress, decode and/or otherwise process received content from stream select module 212 to extract an MPEG or other media stream encoded within the stream. The decoded content can then be processed by one or more display processor modules 218 to create a presentation on display 102 (FIG. 1) for the viewer in any appropriate format. FIG. 2 shows two decoder modules 214A-B operating on two separate signals from transport select module 212. In practice, any number of decoder modules 214 may be used, particularly in PIP settings where multiple signals are simultaneously decoded and displayed. The term “decoder”, then, may collectively apply to one or more decoder modules that are able to decode one or more signals for presentation on display 102.


Display processor module 218 includes any appropriate hardware, software and/or other logic to create desired screen displays via display interface 228 as desired. Such displays may include combining signals received from one or more decoder modules 214A-B to facilitate viewing of one or more channels. In various embodiments, display processing module 218 is also able to produce on screen displays (OSDs) for electronic program guide, setup and control, input/output facilitation and/or other features that may vary from embodiment to embodiment. Such displays are not typically contained within the received or stored broadcast stream, but are nevertheless useful to users in interacting with receiver 108 or the like. The generated displays, including received/stored content and any other displays may then be presented to one or more output interfaces 228 in any desired format.


When the viewer requests a PIP window 111, for example, display processor 218 may be operable to receive the desired imagery from one or more decoder modules 214A-B and to create an image with the imagery from the secondary channel superimposed in PIP window 111 on the imagery 110 from the primary channel. Display processor 218 may also generate symbology such as cursor 114 and/or navigational features (e.g., arrows 115-118 in FIG. 1) making up a user interface that allows the viewer to adjust window 111 (or other features) as desired. As receiver 108 receives user inputs 125 from remote control 112, control logic 205 may direct display processor 218 to adjust window 111 or any other feature of imagery 110 as directed by the viewer. Display processor 218 therefore directs the presentation of PIP window 111 in conjunction with one or more navigation features, and adjusts the PIP window 111 in response to inputs received from the viewer.
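

Conceptually, this compositing step amounts to pasting the decoded secondary frame and any symbology over the primary frame in z-order. The toy sketch below uses nested lists of single-character “pixels” as stand-in frame buffers; the data layout and layering order are assumptions chosen purely for illustration.

    # Toy compositor; frame buffers are nested lists of single-character
    # "pixels" purely for illustration.

    def blank_frame(w, h, fill="P"):          # 'P' marks primary-channel pixels
        return [[fill] * w for _ in range(h)]

    def paste(frame, tile, x, y):
        """Overwrite a rectangular region of frame with tile at (x, y)."""
        for row, line in enumerate(tile):
            for col, px in enumerate(line):
                frame[y + row][x + col] = px

    frame = blank_frame(12, 6)                  # decoded primary video (imagery 110)
    pip = [["S"] * 4 for _ in range(2)]         # decoded secondary video (window 111)
    paste(frame, pip, 7, 1)                     # superimpose the PIP window
    paste(frame, [["^"]], 9, 0)                 # a navigation glyph above the window

    for line in frame:
        print("".join(line))

A real display processor would blend hardware video planes rather than copy characters, but the layering idea is analogous.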


Display processor 218 produces an output signal encoded in any standard format (e.g., ITU656 format for standard definition television signals or any format for high definition television signals) that can be readily converted to standard and/or high definition television signals at interface 228. In other embodiments, the functionality of display processor 218 and interface 228 may be combined in any manner.


Turning now to FIG. 3, an exemplary display 300 suitably includes a PIP window 111 superimposed upon primary imagery 110, as described above. Display 300 also shows navigational features including cursor 114, arrows 115-118, and icons 302, 304, 306 and 308. The viewer moves cursor 114 using, for example, the two-dimensional input device 124 associated with remote control 112, and selects the navigational features using any sort of “select” or “enter” feature, such as a “select” key on remote control 112. In various embodiments, the select key functions similarly to a conventional select button on a mouse, touchpad or other input device commonly associated with a personal computer or the like.


As noted above, clicking on any of the arrow features 115-118 can have the perceived effect of moving window 111 in the direction of the arrow. Clicking on the “right” arrow 116, for example, could move window 111 from its current position toward the rightmost side of display 300.


The exemplary embodiment of FIG. 3 also shows several icons 302, 304, 306, 308. As shown in FIG. 3, icon 302 can be selected to swap the contents of windows 110 and 111 so that the secondary content becomes the primary content, and vice versa. Icon 304 may be selected to create a “side-by-side” display in which both the primary and secondary content occupy roughly equal proportions of display 300. Icons 306 and 308 can be selected to increase and decrease, respectively, the size of window 111. Other embodiments may provide additional features and/or may omit certain features shown in FIG. 3. Moreover, any number of icons 302, 304, 306, 308 may be present in other embodiments, or icons may be omitted entirely in some embodiments.
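

A small dispatch table suggests how icon selections like those in FIG. 3 could be turned into window adjustments. Everything in the sketch below is hypothetical: the size limits, the state layout and the mapping of icon reference numbers to actions are assumptions for illustration.

    # Hypothetical icon dispatch; limits and state shape are assumptions.

    state = {"swapped": False, "side_by_side": False, "size": 2}  # size steps 1..4

    def swap(s):          s["swapped"] = not s["swapped"]             # icon 302
    def side_by_side(s):  s["side_by_side"] = not s["side_by_side"]   # icon 304
    def grow(s):          s["size"] = min(4, s["size"] + 1)           # icon 306
    def shrink(s):        s["size"] = max(1, s["size"] - 1)           # icon 308

    ICON_ACTIONS = {302: swap, 304: side_by_side, 306: grow, 308: shrink}

    def on_icon_select(icon_id, s=state):
        ICON_ACTIONS[icon_id](s)
        return s

    print(on_icon_select(306))  # size grows to 3
    print(on_icon_select(302))  # primary/secondary content swapped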


Navigation features need not be presented on display 300 at all times. In various embodiments, certain navigational features may be activated or deactivated in response to viewer actions with remote control 112. Cursor 114, for example, may remain hidden until inputs are detected on input device 124 and/or remote control 112. Icons 302, 304, 306, 308 may be obscured until cursor 114 is positioned over or near window 111. Arrows 115-118 may similarly be obscured unless cursor 114 is in proximity to window 111. In various embodiments, arrows 115-118 may be displayed when cursor 114 is within a region 310. Region 310 may be any size or shape, and may coincide with window 111 in various embodiments. Other embodiments may obscure and/or display navigational features in response to different conditions, as desired.
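

Proximity-driven visibility of this kind reduces to testing the cursor against a region surrounding window 111. The check below is a sketch under assumed geometry; the actual size and shape of region 310 are left open by the description, so the fixed margin here is an invented stand-in.

    # Sketch of proximity-gated visibility; the margin is an assumption
    # standing in for region 310.

    def inflate(rect, margin):
        """Grow rect = (x, y, w, h) outward by margin on every side."""
        x, y, w, h = rect
        return (x - margin, y - margin, w + 2 * margin, h + 2 * margin)

    def contains(rect, px, py):
        x, y, w, h = rect
        return x <= px < x + w and y <= py < y + h

    def visible_features(window_rect, cursor, margin=60):
        """Return which navigation features to draw for this cursor position."""
        if contains(inflate(window_rect, margin), *cursor):
            return {"arrows", "icons"}   # cursor near/over window: show controls
        return set()                     # otherwise keep the display uncluttered

    print(visible_features((800, 100, 320, 180), (790, 120)))  # arrows and icons shown
    print(visible_features((800, 100, 320, 180), (50, 500)))   # empty set: hidden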


In various further embodiments, some or all of the navigation features presented on display 300 may be altered as conditions warrant. FIG. 4, for example, shows an exemplary display 400 in which the PIP window 111 has moved into the upper right quadrant of the display, making further movement in the upward or rightward directions impractical. In such embodiments, arrows 116 and 117 (FIG. 3) have been removed to disallow further movement in those directions.
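

Suppressing arrows whose direction is no longer usable amounts to comparing the window rectangle against the display bounds. The function below sketches that rule under assumed names and a strict edge test; it simply drops any direction in which the window already abuts an edge.

    # Sketch of edge-aware arrow pruning; names are assumptions.

    def usable_arrows(win, screen_w, screen_h):
        """win = (x, y, w, h); return the set of directions still available."""
        x, y, w, h = win
        arrows = set()
        if x > 0:            arrows.add("left")
        if x + w < screen_w: arrows.add("right")
        if y > 0:            arrows.add("up")
        if y + h < screen_h: arrows.add("down")
        return arrows

    # Window parked in the upper-right corner, as in FIG. 4: only the
    # leftward and downward arrows survive (set order may vary).
    print(usable_arrows((1600, 0, 320, 180), 1920, 1080))  # {'left', 'down'}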



FIG. 4 also shows an alternate navigational feature for changing the position of window 111. In various embodiments, arrow features 115-118 are replaced or supplemented with a drag bar feature 402. Drag bar 402 is any bar, button or other feature capable of being selected (e.g., using cursor 114) and dragged to another position on display 400. Drag bar 402 may be displayed or otherwise activated when cursor 114 enters into region 310, as described above, or according to any other parameters. In other embodiments, drag bar 402 is simply a window header that remains relatively static when window 111 is presented on display 400.
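

Dragging via a feature like drag bar 402 can be modeled as a small state machine: a select press over the bar grabs the window, subsequent cursor motion offsets it, and release drops it. The sketch below assumes method names and a grab-offset convention that the description does not specify.

    # Minimal drag-state sketch; names and conventions are assumptions.

    class DragController:
        def __init__(self, window_pos):
            self.window_pos = window_pos
            self.grab = None          # cursor offset inside the bar while dragging

        def press(self, cursor, bar_rect):
            x, y, w, h = bar_rect
            cx, cy = cursor
            if x <= cx < x + w and y <= cy < y + h:
                wx, wy = self.window_pos
                self.grab = (cx - wx, cy - wy)   # remember where the bar was grabbed

        def move(self, cursor):
            if self.grab is not None:            # only while the bar is held
                gx, gy = self.grab
                self.window_pos = (cursor[0] - gx, cursor[1] - gy)

        def release(self):
            self.grab = None

    d = DragController((800, 100))
    d.press((810, 95), bar_rect=(800, 90, 320, 20))  # grab the drag bar
    d.move((600, 300))                               # drag toward screen center
    d.release()
    print(d.window_pos)                              # -> (590, 305)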



FIG. 5 shows an exemplary process 500 for presenting a PIP window 111 on a display 102. In various embodiments, the various steps shown in FIG. 5 may be executed using source or object code in any format that may be stored in mass storage, firmware, memory or any other digital storage medium within receiver 108. Such code may be executed by any module or combination of modules operating within receiver 108. In an exemplary embodiment, some or all of the steps shown in process 500 are executed by a display processing module 218 (FIG. 2) operating alone or in conjunction with control logic 205 and/or the various other features shown in FIG. 2 and described above.


With reference now to FIG. 5, an exemplary process 500 suitably includes the broad steps of receiving two-dimensional or other inputs related to the PIP display (step 502), processing the inputs (step 504) and then presenting window 111 by drawing or re-drawing window 111 as appropriate (step 510). Various other steps or features may be present as well in any number of alternate embodiments.


Process 500 suitably begins by receiver 108 receiving inputs 125 (step 502) from the viewer. In various embodiments, the received inputs are provided from remote control 112 to receiver 108 via RF interface 232 and antenna 234, although other techniques may be used in other embodiments.


Inputs 125 that are relevant to PIP functionality may initially include any sort of indication that the viewer would like to view a PIP window 111; such an indication may be responsive to a “PIP” button on remote control 112, or to a selection of a menu feature generated on display 102. After PIP window 111 is displayed, subsequent inputs 125 may be received from remote control 112 that allow for moving, resizing or other manipulation of the PIP window, as described above. Such inputs 125 may include, for example, two-dimensional inputs received from the two-dimensional input device 124 associated with remote control 112 to allow for directional movement, resizing, and/or the like.


Directional inputs 125 may be processed (step 504) in any manner. Control logic 205, for example, may process multi-dimensional inputs from input device 124 to extract and determine the viewer's intent for subsequent processing.


In various embodiments, one or more parameters may be checked (step 508) prior to presenting (or re-drawing) PIP window 111. Such parameters may include a screen position of PIP window 111, for example, to ensure that sufficient display space is available for one or more directional features (e.g., arrows 115-118). Other embodiments may detect if cursor 114 is located within a region of interest (e.g., region 310 described above) to ascertain whether certain features should be provided on the display.


When the proper imagery is determined, window 111 may be drawn or redrawn on display 102 as appropriate (step 510). In various embodiments, window 111 is presented superimposed upon primary imagery 110, as described above. Further, various navigational features (e.g., cursor 114, arrows 115-118, icons 302-308, drag bar 402) may be drawn as desired, in accordance with any parameters and rules established in step 508. Window 111 and any associated navigational features may be redrawn in response to inputs subsequently received. In practice, then, after a PIP window 111 and any associated navigation features are presented on display 102 in a first iteration of step 510, subsequently received two-dimensional inputs may be further received and processed (steps 502-508) before re-drawing window 111 according to the newly-received inputs in a subsequent occurrence of step 510. The general logical and data flow of a practical embodiment may be modified from that shown in FIG. 5 in any manner; additional or alternate steps may be provided, and/or one or more steps may be omitted as appropriate.
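

Pulling the steps together, the receiver's PIP handling could be organized as a loop that receives an input, interprets it, applies any drawing rules, and redraws. The skeleton below only sketches that control flow; the event encoding and handler behavior are assumptions layered onto steps 502-510.

    # Control-flow skeleton loosely following process 500; every name here
    # is an assumption for illustration.

    def run_pip_loop(events, state):
        """events: iterable of (kind, payload) tuples standing in for inputs 125."""
        for kind, payload in events:            # step 502: receive an input
            if kind == "delta":                 # step 504: interpret the input
                dx, dy = payload
                x, y = state["cursor"]
                state["cursor"] = (x + dx, y + dy)
            elif kind == "select":
                state["pip_visible"] = not state["pip_visible"]
            # step 508 (assumed shape): check parameters and rules before
            # drawing, e.g. whether navigation features should appear
            state["show_arrows"] = state["pip_visible"]
            draw(state)                         # step 510: draw or redraw

    def draw(state):
        print("redraw:", state)

    run_pip_loop([("select", None), ("delta", (12, -4))],
                 {"cursor": (960, 540), "pip_visible": False, "show_arrows": False})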


Accordingly, new systems and techniques for graphically interacting with a PIP window 111 are described. Various embodiments allow two-dimensional inputs provided from a touchpad or other input device 124 associated with a remote control 112 to interact with arrows, buttons and/or other navigational features so that window 111 may be moved, resized or otherwise manipulated as desired by the viewer. Other embodiments may provide additional or alternate features as desired.


As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations.


While the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing various embodiments of the invention, it should be appreciated that the particular embodiments described above are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of elements described without departing from the scope of the invention.

Claims
  • 1. A method for presenting a picture-in-picture window on a display, the method comprising: presenting the picture-in-picture window that provides first television programming at a first location within a larger window that simultaneously provides second television programming that is different from the first television programming on the display; while the picture-in-picture window is visible on the display, simultaneously presenting a graphical navigation feature proximate the picture-in-picture window on the display, wherein the graphical navigation feature has a distance from and a positioning with respect to the picture-in-picture window; receiving a two-dimensional input that selects the graphical navigation feature; and in response to the two-dimensional input selecting the graphical navigation feature, moving the picture-in-picture window from the first location to a second location within the larger window of the display as the picture-in-picture window presents the first television programming within the second television programming on the display, wherein, subsequent to said moving, the graphical navigation feature remains present on the display, further wherein the graphical navigation feature retains the distance from and the positioning with respect to the picture-in-picture window subsequent to said moving, and further wherein a portion of the graphical navigation feature is not displayed when the picture-in-picture window is moved so as to be abuttingly adjacent to an edge of the display.
  • 2. The method of claim 1 wherein the graphical navigation feature comprises a plurality of arrows located in proximity to the picture-in-picture window.
  • 3. The method of claim 2 wherein the two-dimensional input corresponds to a cursor interaction with one of the plurality of arrows.
  • 4. The method of claim 2 further comprising displaying the plurality of arrows only when a cursor is in proximity to the picture-in-picture window, and otherwise not displaying the plurality of arrows when the cursor is not in proximity to the picture-in-picture window.
  • 5. The method of claim 2 wherein the picture-in-picture window is presented superimposed upon a primary image on the display.
  • 6. The method of claim 5 further comprising displaying only a subset of the plurality of arrows when the picture-in-picture window is adjacent to an edge of the primary image.
  • 7. The method of claim 1 wherein the navigation feature comprises a drag feature located in proximity to the picture-in-picture window, and wherein the drag feature is selectable to drag the picture-in-picture window from the first position to the second location within the larger window of the display.
  • 8. The method of claim 7 wherein the two-dimensional input corresponds to a cursor interaction with the drag feature.
  • 9. The method of claim 1 wherein the receiving comprises receiving the two-dimensional input from a wireless remote control comprising a two-dimensional input device.
  • 10. The method of claim 9 wherein the two-dimensional input device is one of the group consisting of: a touchpad, a trackball, a joystick, a directional pad, and a plurality of directional keys.
  • 11. A video receiver comprising: a receiver interface configured to receive an incoming modulated signal; a decoder configured to decode the incoming modulated signal to obtain primary and secondary video signals; a wireless receiver configured to receive a two-dimensional input signal; and a processor configured to generate an output image comprising the video content contained in the primary video signal in a larger window and the video content contained in the secondary video signal in a picture-in-picture window positioned at a first location within the larger window, wherein the picture-in-picture window is displayed simultaneously with a graphical navigation feature, wherein the graphical navigation feature has a distance from and a positioning with respect to the picture-in-picture window, and wherein the processor is further configured to move the position of the picture-in-picture window from the first location to a different second location within the larger window as the picture-in-picture window is presented on the display when the two-dimensional input signal corresponds to the navigation feature, wherein, subsequent to said move, the graphical navigation feature remains present on the display, further wherein the graphical navigation feature retains the distance from and the positioning with respect to the picture-in-picture window subsequent to said move, and further wherein a portion of the graphical navigation feature is not displayed when the picture-in-picture window is moved so as to be abuttingly adjacent to an edge of the display.
  • 12. The video receiver of claim 11 wherein the receiver interface comprises a satellite interface.
  • 13. The video receiver of claim 11 wherein the wireless receiver is configured to receive the two-dimensional input signal from a wireless remote control comprising a two-dimensional input device.
  • 14. The video receiver of claim 13 wherein the two-dimensional input device is one of the group consisting of: a touchpad, a trackball, a joystick, a directional pad, and a plurality of directional keys.
  • 15. The video receiver of claim 11 wherein the graphical navigation feature comprises a plurality of arrows presented in proximity to the picture-in-picture window.
  • 16. The video receiver of claim 15 wherein the two-dimensional input corresponds to a cursor interaction with one of the plurality of arrows.
  • 17. The video receiver of claim 11 wherein the graphical navigation feature comprises a drag feature located in proximity to the picture-in-picture window.
  • 18. The video receiver of claim 17 wherein the two-dimensional input corresponds to a cursor interaction with the drag feature.
  • 19. A system for presenting television content on a display, the system comprising: a wireless remote control comprising a two-dimensional input device configured to provide a two-dimensional input signal in response to a user input; and a video receiver comprising: a receiver interface configured to receive an incoming modulated signal; a decoder configured to decode the incoming modulated signal to obtain primary and secondary video signals; a wireless receiver configured to receive the two-dimensional input signal from the wireless remote control; and a processor configured to generate an output image to be presented on the display, wherein the output image comprises the video content contained in the secondary video signal in a picture-in-picture window superimposed on a larger window that provides the video content contained in the primary video signal, wherein the picture-in-picture window is initially located at a first position within the larger window in proximity to a plurality of simultaneously-displayed graphical directional indicators, and wherein the processor is further configured to relocate the picture-in-picture window from the first position to a second position within the larger window that is different from the first position as the picture-in-picture window is presented on the display when the two-dimensional input signal corresponds to one of the plurality of directional indicators, and wherein the plurality of directional indicators comprise: 1) first, second, third, and fourth arrows positioned adjacently to the left, to the right, above, and below the picture-in-picture window, respectively, wherein when the two-dimensional input signal corresponds to the first arrow the picture is relocated leftward, when the two-dimensional input signal corresponds to the second arrow the picture is relocated rightward, when the two-dimensional input signal corresponds to the third arrow the picture is relocated upward, and when the two-dimensional input signal corresponds to the fourth arrow the picture is relocated downward, and 2) a drag feature located in proximity to the picture-in-picture window, and wherein the drag feature is selectable to drag the picture-in-picture window from the first position to the second location within the larger window of the display.
  • 20. The system of claim 19, wherein the processor is further configured to display the first, second, third and fourth arrows and the drag feature only when a cursor is in proximity to the picture-in-picture window, and otherwise not display the first, second, third and fourth arrows or the drag feature when the cursor is not in proximity to the picture-in-picture window.
Related Publications (1)
Number Date Country
20100079671 A1 Apr 2010 US