Various embodiments relate to user interfaces for media devices, as well as to mechanisms for controlling media devices through other media devices.
Many media devices have been developed to provide different functionalities. A typical home theater system, for example, includes multiple media devices such as a DVD player, a time-shifting device (e.g., a digital video recorder (DVR)), and a place-shifting device (e.g., a Slingbox from Sling Media, Inc.). To control these various media devices, customers typically juggle or switch between multiple remote controls, which can be inconvenient and confusing.
One approach to this problem is the universal remote control, which is typically a remote control that has been preprogrammed to operate a variety of devices in parallel. When a user wants to control a device, the user may press a mode button on the universal remote control to switch the remote to the intended device, and then use the universal remote control to operate that device. When the user wants to control a different device, he or she generally needs to switch the universal remote control to that other device before the remote can control it.
This approach is insufficient for many applications because it does not typically allow for seamless operation. Users must generally first figure out which device to control before using the universal remote control, which can be confusing when multiple devices are involved.
In addition, this approach does not typically function when the multiple devices are controlled through one another. For example, assume a local device (e.g., a set-top box) connects to and controls a remote device (e.g., a place-shifting device). Users of the universal remote control cannot generally control both devices easily, because the remote device is controlled through the local one. The universal remote control is generally designed to control multiple devices in parallel, operating with each of them individually and directly; as a result, the universal remote does not typically work well in settings where it is asked to control one device through another.
Therefore, there is a need in the art for a way to enable users to control multiple devices seamlessly, including remote, downstream devices. There is also a need for an efficient and intuitive user interface.
Various exemplary embodiments relate to systems and methods for processing user inputs received from a remote control. In a first exemplary embodiment, a method of processing an input received from a user via a remote control suitably comprises presenting a media stream on a display and receiving the input from the remote control at a local device associated with the display. The local device determines if the input is intended for the local device or a remote device. If the input is intended for the local device, the input is processed at the local device. If the input is intended for the remote device, a signal is transmitted from the local device to the remote device to thereby allow the remote device to respond to the input.
In another embodiment, a method of processing an input received from a user via a remote control suitably comprises presenting first imagery on a display, wherein the first imagery comprises a presentation of a media stream having a periphery and an interface feature comprising a first plurality of input options, and wherein the first plurality of input options is arranged around the periphery of the presentation of the media stream on the display. The input is received from the remote control at a local device associated with the display, wherein the input is a directional input having a direction corresponding to one of the first plurality of input options. In response to the directional input, second imagery is presented on the display, wherein the second imagery comprises a second plurality of input options that is presented adjacent to the presentation of the media stream on the display in a direction from the presentation of the media stream that corresponds to the direction of the input.
In still other embodiments, a system for processing an input received via a remote control from a viewer of a display comprises a wireless receiver configured to receive the input from the remote control, a network interface configured to be coupled to a network, a display interface configured to be coupled to the display, and a processor. The processor is configured to receive the input from the wireless receiver, to receive a media stream from a remote device via the network interface, to present imagery comprising the media stream on the display via the display interface, to process the input at the local device if the input is intended for the local device, and to transmit a signal to the remote device via the network to thereby allow the remote device to respond to the input if the input is intended for the remote device.
The disclosed embodiments have other advantages and features which will be more readily apparent from the detailed description, the appended claims, and the accompanying drawings.
Alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
Embodiments of the present disclosure provide a user interface for users to interact with a media device, as well as a method for users to control multiple devices through a single device. The user interface may be implemented on any sort of device, including any sort of media player, time or place shifting device, receiver, recorder, set-top box or the like. While some embodiments may implement both the interfaces and the control features described herein for a high level of functionality and convenience, it is not necessary that embodiments of the user interface include the ability to control multiple devices, nor is it necessary that multi-device embodiments make use of the user interface features described herein.
System Architecture of an Exemplary Embodiment
The media device 110 may receive media content from the place-shifting device 130, the personal computer 140, the Internet 150, a cable, satellite or broadcast television receiver, any remote media devices that may be present, and/or any other sources as appropriate.
Media device 110 may be logically and physically implemented in any manner.
Various embodiments of control logic 205 can include any circuitry, components, hardware, software and/or firmware logic capable of controlling the components and processes operating within device 110.
Media device 110 includes an appropriate network interface 210 that operates using any implementation of protocols or other features to support communication by device 110 on the network. In various embodiments, network interface 210 supports conventional LAN, WAN or other protocols (e.g., the TCP/IP or UDP/IP suite of protocols widely used on the Internet) to allow device 110 to communicate on the network as desired. Network interface 210 typically interfaces with the network using any sort of LAN adapter hardware, such as a conventional network interface card (NIC) or the like provided within device 110.
Storage interface 206 includes any physical, logical and/or other features that can be used to interface with an internal or external storage medium 215, such as a magnetic or optical disk drive, a flash memory card, and/or any other sort of storage as appropriate. In various embodiments, storage interface 206 is a universal serial bus (USB) or other standard interface that allows users to store files at a conventional computer system (e.g., computer 140 in some embodiments) for playback via media device 110. In such embodiments, media device 110 will typically include a physical interface that can receive the storage medium 215, as well as a logical interface that may be implemented within the SoC or other logical features of device 110 to execute in response to control logic 205.
In many embodiments, media device 110 includes an input interface 207 that receives infrared or other wireless instructions from remote control 160. Input interface 207 may also include any number of buttons, sliders, knobs or other physical input devices located on a housing of device 110. In operation, user instructions provided by remote control 160 and/or any other input features are received at input interface 207 for subsequent processing by control logic 205. In various embodiments, control logic 205 takes appropriate actions based upon the particular inputs received; examples of appropriate actions may include directing display processor 218 to generate or modify the presented imagery, directing a command packet to be sent to a remotely-located content source, and/or any other actions.
Transport stream select module 212 is any hardware and/or software logic capable of selecting a desired media stream from the available sources.
Display processor module 218 includes any appropriate hardware, software and/or other logic to create desired screen displays at interface 228. In various embodiments, display processor module 218 is able to decode and/or transcode the received media to a format that can be presented at display interface 228. The generated displays, including received/stored content and any other displays, may then be presented to one or more output interfaces 228 in any desired format. In various embodiments, display processor 218 produces an output signal encoded in any standard format (e.g., ITU656 format for standard definition television signals or any format for high definition television signals) that can be readily converted to standard and/or high definition television signals at interface 228.
Display processor module 218 is also able to produce on-screen displays (OSDs) for electronic program guides, setup and control, input/output facilitation and/or other features that may vary from embodiment to embodiment. Such displays are not typically contained within the received or stored broadcast stream, but are nevertheless useful to users in interacting with device 110 or the like. In particular, on-screen displays can be used to generate user interfaces that allow convenient program selection, control and the like, as described more fully below.
In operation, then, the user selects desired media content from a network source (e.g., place-shifting device 130, computer 140, or any other available source).
Additionally, in various embodiments, media device 110 is also able to transmit control information to a remotely-located media source via the network. As user instructions are received from remote control 160, for example, control logic 205 or another feature within media device 110 may formulate a command request message that is transmitted over the network for execution at the remote media source 115, thereby changing the media stream provided by that remote media source 115.
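The format and transport of such command request messages may vary from embodiment to embodiment and are not prescribed here. Purely as an illustration, a minimal sketch might serialize a command as follows; the field names, port number, and JSON-over-TCP transport are assumptions made for illustration only:

```python
# Illustrative only: one way a command request message might be serialized
# and transmitted to a remote media source. The field names, port number,
# and JSON-over-TCP transport are assumptions, not part of any embodiment.
import json
import socket

def send_command_request(host: str, port: int, key: str) -> None:
    message = json.dumps({"type": "command_request", "key": key}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(message)

# e.g., send_command_request("192.0.2.10", 5001, "Channel Up")
```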
User Interface
The media device 110 described above suitably provides a user interface that can be presented on the display 120 and navigated with the remote control 160, as described below.
In one embodiment, users may access a top-level user interface (or top-level menu) by pressing a menu button on the remote control 160 or other input device. The remote control 160 sends a signal (or control signal) to the media device 110 indicating that the user pressed the menu button. Upon receiving the signal, the media device 110 displays the top-level user interface on the display 120.
Prior to receiving the signal, media device 110 may be in a stand-by mode or may be actively displaying a live or other video on the display 120, as appropriate. In various embodiments, if the display 120 is displaying video (e.g., live video or any other imagery) when the media device 110 receives the signal, the media device 110 suitably scales the video down and displays it in a video window in the top-level user interface, allowing the user to continue watching the video while navigating through the user interface.
In one embodiment, the top-level user interface includes up to four items (or menu items or controls or options) spatially arranged in a compass-type layout around the video window. Users may select (or access) an item of interest by navigating (or gesturing) in the direction indicated by the spatial location of the item. For example, if the user intends to select the item displayed above the video window, he or she may gesture "up" using the buttons on the remote control 160 (e.g., pressing an "up" arrow or a "direction up" button). Alternatively, the gesture may be provided using a touchpad, joystick, trackball, directional pad or other two-dimensional input device provided on remote control 160. In other embodiments, the top-level user interface may include more than four items spatially arranged around the video window. For example, the top-level user interface may include eight menu items displayed on the four sides (up, down, left, right) and the four corners (top-left, bottom-left, top-right, bottom-right). Other spatial arrangements (e.g., any sort of row/column, circular, octagonal, or other arrangement) could also be formulated.
Upon receiving a user selection of an item displayed in the top-level user interface (a top-level item), the media device 110 suitably responds in any appropriate manner. In various embodiments, the media device 110 moves the video window in the opposite direction of the user navigation and displays a second-level user interface associated with the selected top-level item in the remaining display area of the display 120. For example, if the user selects the top-level item displayed to the left of the video window, the media device 110 shifts the video window to the right and displays a second-level user interface to the left of the video window. Other embodiments may take other actions as appropriate.
The second-level user interface, like the top-level user interface, may be spatially arranged in a compass-type layout or the like. In one embodiment, the second-level user interface includes up to three items spatially arranged together with the video window in a compass-type layout. Users may select an item of interest (including the video window) by navigating in the direction indicated by the spatial location of the item. Users may access the top-level user interface by selecting the video window, by pressing a home button on the remote control 160, or by any other technique. Because the video window is displayed in the direction opposite the user's navigation, users may go back to the top-level user interface by navigating in the direction opposite to the previous navigation. For example, if the user navigated left from the top-level user interface to a second-level user interface, he or she can go back to the top-level user interface by navigating right in the second-level user interface.
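This navigation behavior can be summarized as a stack of menu levels in which the video window always occupies the side opposite the last navigation. The following is a minimal sketch of that behavior under those assumptions; the class and method names are hypothetical and chosen only for illustration:

```python
# Minimal sketch of the compass-style navigation described above. MenuNode
# and Navigator are illustrative names, not identifiers from any embodiment.
OPPOSITE = {"up": "down", "down": "up", "left": "right", "right": "left"}

class MenuNode:
    """One level of the interface; items are laid out by compass direction."""
    def __init__(self, name, items=None):
        self.name = name
        self.items = items or {}          # direction -> child MenuNode

class Navigator:
    def __init__(self, root):
        # Each stack entry pairs a menu level with the direction in which the
        # scaled-down video window sits at that level (None at the top level).
        self.stack = [(root, None)]

    def gesture(self, direction):
        node, video_dir = self.stack[-1]
        if direction == video_dir and len(self.stack) > 1:
            self.stack.pop()              # selecting the video window goes back
            return
        child = node.items.get(direction)
        if child is not None:
            # The video window moves opposite the navigation; the new level's
            # items occupy the vacated side of the screen.
            self.stack.append((child, OPPOSITE[direction]))

# Example mirroring the "My Media" walkthrough below: navigating left opens
# the second-level menu; navigating right (toward the video window) returns.
top = MenuNode("Top", {"left": MenuNode("My Media", {"left": MenuNode("Queue")})})
nav = Navigator(top)
nav.gesture("left")   # -> "My Media" level, video window on the right
nav.gesture("right")  # -> back to the top-level menu
```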
In one embodiment, the second-level user interface includes an icon (or text or image) identifying the associated top-level item, displayed at (or proximate to) the center and surrounded by the items of the second-level user interface (second-level items). This icon indicates to the user the path taken to reach the current user interface.
Upon user selection of a second-level item in this embodiment, the media device 110 moves the video window in the opposite direction of the user navigation and displays a third-level user interface associated with the selected second-level item in the remaining display area of the display 120. The third-level user interface and deeper levels of the user interface may be displayed and/or interacted with in a manner similar to the second-level user interface. For example, the user may select an item displayed in an N-level user interface (an N-level item) to access an (N+1)-level user interface, or select the video window to go back to an (N−1)-level user interface, where N may be any integer larger than 1. In one embodiment, the media device 110 indicates the context of the current-level user interface (e.g., the selected items or path leading to the current-level user interface).
As described above, when a user selects an item in a user interface of any level by navigating in a direction, the media device 110 moves the video window in the opposite direction and displays a user interface associated with the selected item in the remaining display area. The user interface associated with the selected item may itself include multiple selectable items. As described above, the media device 110 may display the multiple selectable items in a compass-type layout. When the number of selectable items is large (e.g., more than about three in some embodiments), the media device 110 may instead display the items in a vertical (or horizontal) list on one side and display the video window on the opposite side. Users may navigate through the vertical list by gesturing up or down using the remote control 160. When switching from one level of user interface to another, the media device 110 may insert an animated transition between them, or provide other enhancements as desired.
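As a compact sketch of this layout rule (with the threshold of about three items treated as a configurable assumption):

```python
# Illustrative only: choose a compass layout for a handful of items and a
# scrollable list otherwise, using the approximate threshold noted above.
def choose_layout(items, compass_limit=3):
    return "compass" if len(items) <= compass_limit else "vertical_list"

# e.g., choose_layout(["Queue", "Sling Projector"]) -> "compass"
```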
An example of such a user interface is illustrated by the sequence of screen displays described below.
After a user presses a menu button on a remote control 160 associated with the media device 110 or takes another action to initiate the interface, media device 110 displays a top-level user interface on the television.
Depending on the user selection, the media device 110 displays a second-level user interface associated with the selected top-level item. For example, if the user selected "My Media" by pressing a left arrow or other feature on the remote control 160, the media device 110 could move the video window to the right of the display, opposite to the user navigation. Device 110 could then display additional items (e.g., "Queue" and "Sling Projector") to the left of the video window.
Assuming the user selected "Queue" by providing a "left" indication on the remote control 160, the media device 110 could display a further user interface, such as a vertical list of the items in the queue.
The vertical list as shown also displays the items above and below the current item by their titles. In this embodiment the user can navigate to the items displayed above the current item by providing an "up" indication, and to items below by providing a "down" indication, on the remote control. The user interface may display additional items and hide existing items as the user scrolls up or down the vertical list. The user interface indicates in the top-left corner that the vertical list is sorted by date, and the user may sort the list by pressing the "Menu" button on the remote control.
In response to the appropriate input from the user (e.g., depressing an "OK" button on remote 160), the media device 110 plays the video program of the current item in the vertical list on the display 120.
After the video program finishes, the media device 110 suitably scales the video down and displays it in a video window in various embodiments. Media device 110 may also notify viewers that they have finished watching the video program, and may display several options for the viewer to select.
Remote Control
With primary reference again to the exemplary system described above, the remote control 160 may be used to control not only the local media device 110 but also remote devices whose content is presented through it, as described below.
In one embodiment, the local device includes a remote key router module (hereinafter called an RKR module) 231. The RKR module 231 includes any sort of hardware and/or software features that are configured to determine a target device for received commands and to route (or relay or pass) the commands to the target device (if different from the local device). The RKR module 231 has access to key mapping information, which includes keys acceptable by the local device and by remote devices connected with the local device. Key mapping data may be stored in a lookup table or other logical structure within control logic 205 and/or elsewhere within the local device, as described more fully below. In various embodiments, the RKR module 231 is implemented as a software module or routine that executes on a processor residing within the local device (e.g., media device 110).
In one embodiment, each device (local or remote) is represented by an agent application 233A-C running in the local device. The agent application 233 includes hardware and/or software features that are configured to route data (e.g., commands and/or content) between the represented device and the RKR module. The RKR module can therefore communicate with the devices through the agent applications 233, rather than communicating with the devices directly. In an exemplary embodiment, RKR module 231 and any agent applications 233 are applets, programs or other modules executing as part of control logic 205.
Referring now to an exemplary method 800 of processing commands received from a remote control, the method suitably includes the broad steps described below.
The RKR module 231 initially waits 810 for commands from associated input devices (e.g., a remote control) as appropriate. After receiving 820 a command (e.g., a remote control key) from an input device, the RKR module 231 determines 830 the target device(s) for the command. In one embodiment, the RKR module 231 checks the key mapping information to identify the device(s) accepting the received command and determines 830 those device(s) to be the target device(s).
The RKR module 231 determines 840 whether the current active device is a target device. The current active device is any device with which the user is actively engaged. For example, if the user is watching (or otherwise interacting with) video streamed from a source (e.g., a place-shifting device 130), then the current active device will typically be the place-shifting device 130. The current active device can be the local device or a remote device. By default, the local device may be initially considered the current active device (although other embodiments may assume that a remote device is initially the current active device). The current active device information may be stored together with the key mapping information or elsewhere as desired.
If the RKR module 231 determines 840 that the current active device is a target device, then the RKR module 231 routes 850 the command to the agent application 233A-C representing the current active device. In one embodiment, even if there are target devices other than the current active device, the RKR module routes the command to the current active device; because the current active device is the device with which the user is actively engaged, the user most likely intends the received command for it.
If the RKR module 231 determines 840 that the current active device is not a target device, then the RKR module routes 860 the command to an agent application 233 representing a target device. In one embodiment, if there are multiple target devices, the RKR module routes 860 the command to the target device with the highest priority. The agent 233 then transmits the appropriate instruction to the remote device over the network or other link to produce the effect desired by the viewer.
The RKR module 231 (and/or one or more agent applications 233) also determines 870 whether the received command leads to a control context change. In one embodiment, one or more commands (or keys) may be configured to indicate an intention to switch control context. For example, the user may press a menu button (or option button) while operating a remote device (e.g., a place-shifting device) to indicate a control context shift and to trigger the local device to display its menu. In another embodiment, depending on user configuration, a command not supported by the current active device is considered to lead to a control context change to the target device. In yet another embodiment, if the RKR module 231 receives a command that previously led to the most recent control context change, the RKR module 231 determines that the command leads to a control context change, and restores the previous control context by making the last active device the current active device.
If it is determined 870 that the command does not lead to a control context change, the RKR module 231 waits 810 for the next command. Otherwise, the RKR module 231 determines 880 whether the command was routed 850 to the current active device. If the command was routed 850 to the current active device, it may be determined that the user wants to resume previous engagement with the last active device. Therefore, the RKR module 231 makes 890 the last active device the current active device. In one embodiment, the agent application of the current active device (or the RKR module 231) determines that the received command is for the last active device, and forwards (or passes) the command to that device's agent application.
If the command was routed 860 to a target device that is not the current active device, it is determined that the user intends to start engaging with the target device. Therefore, the RKR module 231 makes 895 the target device the new current active device, and the previous current active device becomes the last active device. After switching control context, the RKR module 231 resumes waiting 810 for the next command.
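The following sketch condenses steps 810 through 895 into executable form. It is a simplified illustration only: the class names, the key-map structure, and the rule used for determination 870 (a designated context key, or routing away from the active device) are assumptions, and an actual embodiment may implement the method within control logic 205 in any manner:

```python
# A minimal, illustrative sketch of method 800. Class names, the key-map
# structure, and the context-change rule below are assumptions made for
# illustration; they are not taken from any actual implementation.

class Agent:
    """Stand-in for an agent application 233 representing one device."""
    def __init__(self, name: str):
        self.name = name

    def send(self, key: str) -> None:
        # A real agent would relay the command to the device it represents,
        # locally or over the network.
        print(f"{self.name} handles '{key}'")

class RKRModule:
    CONTEXT_KEYS = {"Menu"}  # keys configured to indicate a context switch

    def __init__(self, key_map, agents, local_device):
        self.key_map = key_map        # key -> accepting devices, highest priority first
        self.agents = agents          # device name -> Agent
        self.active = local_device    # the local device is active by default
        self.last_active = local_device

    def handle(self, key: str) -> None:
        targets = self.key_map.get(key, [])       # determine 830 the target device(s)
        if not targets:
            return                                # no device accepts this key
        if self.active in targets:                # determine 840
            target = self.active                  # route 850 to the active device
        else:
            target = targets[0]                   # route 860 to the highest-priority target
        self.agents[target].send(key)
        # Determine 870 (simplified): a context change occurs on a designated
        # context key, or when the command was routed away from the active device.
        if key in self.CONTEXT_KEYS or target != self.active:
            if target == self.active:             # determine 880 -> restore 890
                self.active, self.last_active = self.last_active, self.active
            else:                                 # make 895 the target the active device
                self.last_active, self.active = self.active, target
```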
The following examples illustrate the method 800 in a place-shifting context. In these examples, the place-shifting device 130 (the remote device) streams video content to the media device 110 (the local device). Users may use the remote control 160 to control the media device 110 or the place-shifting device 130 through the media device. Other embodiments may apply the concepts and techniques described in this example to control any number of other components. Indeed, a single media device could control multiple other devices in various embodiments. Further, a "controlled" device may itself control other devices. A place-shifting device 130, for example, may itself control a DVR 180, DVD player 170, receiver 175 and/or any other device using the techniques described herein, or using an infrared or other wireless "blaster" type device that emulates signals transmitted by a remote control associated with the device. As an example, an instruction transmitted by remote 160 may be received at media device 110, transmitted over a network or other link to a place-shifting device 130, and then relayed via an RF emulator from device 130 to DVR 180, which may itself control receiver 175. Many different scenarios could be formulated across a wide array of equivalent embodiments.
The table below illustrates key mapping information for an exemplary media device 110 and an exemplary place-shifting device 130 used in the following examples; other embodiments may use different key mapping information as appropriate.

Key | Media Device 110 | Place-Shifting Device 130
---|---|---
Menu | Yes | No
Left | Yes | Yes
Right | Yes | Yes
Up | Yes | Yes
Down | Yes | Yes
Option | Yes | Yes
Channel Up | No | Yes
Channel Down | No | Yes
Play | Yes | Yes
Guide | No | Yes

As illustrated, the remote control 160 has the following ten keys: a Menu key, four direction keys (Left, Right, Up, and Down), an Option key, two channel keys (Channel Up and Channel Down), a Play key, and a Guide key. The media device 110 accepts (or supports or handles) seven of the ten keys (Menu, Left, Right, Up, Down, Option, and Play). The place-shifting device 130 accepts nine of the ten keys (Left, Right, Up, Down, Option, Channel Up, Channel Down, Play, and Guide) in this example.
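Expressed as data for the earlier routing sketch, this key mapping might look like the following; the device identifiers are illustrative only, and list order stands in for routing priority:

```python
# The key mapping table above as a lookup, usable with the RKRModule sketch
# shown earlier. Device identifiers are illustrative; list order stands in
# for routing priority when the active device does not accept a key.
KEY_MAP = {
    "Menu":         ["media_device_110"],
    "Left":         ["media_device_110", "place_shifter_130"],
    "Right":        ["media_device_110", "place_shifter_130"],
    "Up":           ["media_device_110", "place_shifter_130"],
    "Down":         ["media_device_110", "place_shifter_130"],
    "Option":       ["media_device_110", "place_shifter_130"],
    "Channel Up":   ["place_shifter_130"],
    "Channel Down": ["place_shifter_130"],
    "Play":         ["media_device_110", "place_shifter_130"],
    "Guide":        ["place_shifter_130"],
}
```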
For example, assume a user is operating the local media device 110 and presses the Menu key on the remote control 160. The RKR module 231 on the media device 110 receives 820 a signal indicating this user action. Because the Menu key is accepted by a single device, the media device 110, the RKR module 231 determines 830 that the media device 110 is the target device and routes 850 the Menu key command to it. The RKR module 231 determines 870 that the Menu key command does not lead to a control context change, and the local media device 110 remains the current active device.
As another example, assume the user is actively operating the remote place-shifting device 130. The RKR module 231 receives 820 a signal indicating that the user pressed the Play key on the remote control 160. Both the local media device 110 and the remote place-shifting device 130 accept the Play key, so the "play" input is at least potentially ambiguous. However, because the place-shifting device 130 is the current active device, the RKR module 231 determines 830 that the place-shifting device 130 is the target device and routes 850 the Play key command to the place-shifting device 130. The RKR module 231 in this embodiment determines 870 that the Play key command does not lead to a control context change, and the remote place-shifting device 130 remains the current active device.
As an example of a control context change, assume the user is watching video streamed from the place-shifting device 130 through the media device 110 on the display 120. In this context, commands from the remote control 160 are normally related to the place-shifting device 130 (e.g., pause, fast forward). Therefore, the RKR module 231 deems the place-shifting device 130 the current active device.
As the user presses the Menu key, the RKR module 231 receives 820 a corresponding command from the remote control 160. The RKR module 231 checks the key mapping and determines 830 that the local media device 110 is the target device because it is the only device accepting the Menu key. The RKR module 231 determines 840 that the current active device, the place-shifting device 130, is not a target device and routes 860 the Menu key command to the media device 110.
The RKR module 231 determines 870 that the Menu key command leads to a control context change, and determines 880 that the Menu key command was not routed to the current active device. Therefore, the RKR module 231 makes 895 the target device, the local media device 110, the current active device, and routes subsequent commands to it unless and/or until receiving a command that leads to another control context change (e.g., the user presses the Menu key again or presses a key only accepted by the place-shifting device 130). After this control context switch, the user can navigate the user interface of the media device 110 using the remote control 160.
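Continuing the RKRModule and KEY_MAP sketches from above, the three scenarios just described can be traced as follows; this is purely illustrative, and the direct assignment to rkr.active merely stands in for the context established when streaming begins:

```python
# Tracing the three examples above through the earlier sketches.
agents = {"media_device_110": Agent("media device 110"),
          "place_shifter_130": Agent("place-shifting device 130")}
rkr = RKRModule(KEY_MAP, agents, local_device="media_device_110")

rkr.handle("Menu")                    # sole target is the active local device: route 850
rkr.active = "place_shifter_130"      # user starts watching streamed video
rkr.handle("Play")                    # ambiguous key, but the active device accepts it: route 850
rkr.handle("Menu")                    # only the local device accepts Menu: route 860
assert rkr.active == "media_device_110"   # context change 895 back to the local device
```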
In one embodiment, the RKR module 231 determines a priority for each device and routes commands based on priority. For example, the RKR module 231 may route a received command to the device that accepts the command and has the highest priority. Priorities may be determined based on factors such as how frequently and/or how recently the user interacts with each device; a priority may also be pre-determined or assigned by the user.
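A priority-based variant of determination 830 might be sketched as follows, with the scoring scheme assumed for illustration:

```python
# Illustrative only: among the devices that accept a key, select the one
# with the highest priority score, however those scores are derived
# (recency/frequency of use, pre-determined, or user-assigned).
def select_target(key, key_map, priorities):
    candidates = key_map.get(key, [])
    if not candidates:
        return None
    return max(candidates, key=lambda device: priorities.get(device, 0))

# e.g., select_target("Play", KEY_MAP, {"media_device_110": 1, "place_shifter_130": 5})
# -> "place_shifter_130"
```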
In one embodiment, users may program soft-keys on the remote control by assigning commands intended for the local device and/or a remote device to the programmable soft-keys as desired. Subsequently, when the user presses these programmed soft-keys, the RKR module 231 automatically sets the control context to be for the local device or the remote device and acts accordingly.
By implementing the method in the local device, a user may control both the local device and the remote devices connected with it through a single control device, without specifying the intended target. The disclosure therefore provides a non-interruptive method for a user to control multiple devices.
This application claims priority to U.S. Non-Provisional Patent Application Ser. No. 12/256,344, filed on Oct. 22, 2008, which claims priority to U.S. Provisional Patent Application Ser. No. 60/981,993, filed on Oct. 23, 2007.