Methods and apparatus for seeking within a media stream using scene detection

Information

  • Patent Grant
  • Patent Number
    9,565,479
  • Date Filed
    Monday, August 10, 2009
  • Date Issued
    Tuesday, February 7, 2017
Abstract
A system uses generated scene transition frames to allow fast seeking within a media stream. A set of scene transition frames associated with the media stream is generated and then transmitted along with the media stream from a remotely located media source over a network. A subset of the scene transition frames is displayed, allowing a desired scene transition frame to be selected from the subset based on user input. The media stream can then be displayed (e.g., played) starting from a frame corresponding to the desired scene transition frame.
Description
TECHNICAL FIELD

The present disclosure generally relates to user interfaces used in connection with streamed media, and more particularly relates to methods and apparatus for seeking for a particular scene or time within streamed media.


BACKGROUND

Recently, consumers have expressed significant interest in “place shifting” devices that allow viewing of television or other media content at locations other than their primary television set. Place shifting devices typically packetize media content that can be transmitted over a local or wide area network to a portable computer, mobile phone, personal digital assistant, remote television or other remote device capable of playing back the packetized media stream for the viewer. Placeshifting therefore allows consumers to view their media content from remote locations such as other rooms, hotels, offices, and/or any other locations where portable media player devices can gain access to a wireless or other communications network.


In the context of media streaming, including for example conventional general purpose computers running software for streaming placeshifted media, it is desirable to allow a user to seek within the media time buffer to find a particular scene or time slot, and then continue viewing the media from that point. The user interface for such a seeking operation typically allows a user to move (or “scrub”) an icon such as a play-head icon across a visual representation of a timeline associated with the media. In response, the software then goes to the nearest keyframe, decodes and drops all frames until it reaches the desired position (i.e., time), and displays the correct frame.
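

For illustration only, a minimal sketch of this conventional keyframe-based seek might look like the following (the frame representation and decoder call are hypothetical stand-ins, not taken from any particular player):

```python
# Hypothetical sketch of conventional keyframe-based seeking: jump to the
# nearest preceding keyframe, then decode and drop frames until the
# desired time is reached. All types here are illustrative.

from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float    # presentation time, in seconds
    is_keyframe: bool
    data: bytes

def decode(frame: Frame) -> Frame:
    return frame        # stand-in for a real decoder

def seek(frames: list[Frame], target_time: float) -> Frame:
    """Return the decoded frame at (or just after) target_time."""
    # Find the last keyframe at or before the target position.
    key_index = 0
    for i, frame in enumerate(frames):
        if frame.is_keyframe and frame.timestamp <= target_time:
            key_index = i
    # Decode forward from that keyframe, dropping frames until the target.
    for frame in frames[key_index:]:
        decoded = decode(frame)
        if frame.timestamp >= target_time:
            return decoded                  # the frame actually displayed
    return decode(frames[-1])               # target beyond the buffered end
```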


Such a system is undesirable in a number of respects. For example, as the user's sole input is through a linear time display, it is often difficult to find a particular scene or transition within the media. That is, it is often the case that a user is far more interested in finding a particular favorite scene within a media stream than a particular discrete time within that stream. Furthermore, with a standard linear scrubber interface, the user is typically not given immediate feedback while scrubbing the icon along the timeline, and the resulting user interface lacks responsiveness.


It is therefore desirable to create systems and methods for seeking within streamed media in a way that is responsive, intuitive, and provides useful scene or scene transition information to the user. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.


BRIEF SUMMARY

According to various exemplary embodiments, systems and methods are described for using generated scene transition frames to allow fast seeking within a media stream. A method of viewing a media stream in accordance with one embodiment includes: receiving a set of scene transition frames and a media stream from a remotely located media source over a network; displaying a subset of the scene transition frames; allowing a user to select a desired scene transition frame from the subset of the scene transition frames based on user input; and displaying the media stream starting from a frame corresponding to the desired scene transition frame.


A media player system for viewing a media stream received from a remotely located media source over a network includes a memory for storing a set of scene transition frames associated with the media stream, a user interface configured to display a subset of the scene transition frames and to receive user input indicating a selected scene transition frame, and a display for displaying the media stream starting from a frame corresponding to the selected scene transition frame.


A method of providing media stream scene information in accordance with another embodiment includes: receiving the media stream; generating (e.g., in real time) a plurality of scene transition frames associated with the media stream; and transmitting the plurality of scene transition frames and the media stream substantially contemporaneously over a network.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and



FIG. 1 is a block diagram of an exemplary media encoding system;



FIG. 2 is a block diagram of an exemplary media encoding device;



FIG. 3 is a conceptual overview of a typical timeline user interface used in connection with a media buffer;



FIG. 4 depicts, conceptually, the generation of scene transition frames in accordance with one embodiment; and



FIGS. 5 and 6 depict, conceptually, exemplary user interfaces for allowing a user to select a desired scene transition frame.





DETAILED DESCRIPTION

The following detailed description of the invention is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.


Turning now to the drawing figures and with initial reference to FIG. 1, an exemplary placeshifting system 100 suitably includes a placeshifting encoder system 102 that receives media content 122 from a content source 106, encodes the received content into a streaming format, and then transmits the encoded media stream 120 to a media player 104 over network 110. The media player 104 suitably receives the encoded stream 120, decodes the stream, and presents the decoded content to a viewer on a television or other display 108. In various embodiments, a server 112 may also be provided to communicate with encoder system 102 and/or player 104 via network 110 to assist these devices in locating each other, maintaining security, providing or receiving content or information, and/or any other features as desired. This feature is not required in all embodiments, however, and the concepts described herein may be deployed in any data streaming application or environment, including placeshifting as well as any other media or data streaming situation.


Placeshifting encoder system 102 is any component, hardware, software logic and/or the like capable of transmitting a packetized stream of media content over network 110. In various embodiments, placeshifting device 102 incorporates suitable encoder and/or transcoder (collectively “encoder”) logic to convert audio/video or other media data 122 into a packetized format that can be transmitted over network 110. The media data 122 may be received in any format, and may be received from any internal or external source 106 such as any sort of broadcast, cable or satellite television programming source, a “video-on-demand” or similar source, a digital video disk (DVD) or other removable media, a video camera, and/or the like. Encoder system 102 encodes media data 122 to create media stream 120 in any manner. In various embodiments, encoder system 102 contains a transmit buffer 105 that temporarily stores encoded data prior to transmission on network 110. As buffer 105 fills or empties, one or more parameters of the encoding (e.g., the bit rate of media stream 120) may be adjusted to maintain desirable picture quality and data throughput in view of the then-current network performance. As described more fully below, various embodiments are able to calculate a current encoding rate and a current network transfer rate, and are able to adjust the encoding rate as the network transfer rate changes. Changes in the network transfer rate may be identified from, for example, changes in the utilization of the outgoing buffer 105.
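

As a rough sketch of the rate adaptation described above (the thresholds, step size, and bounds are invented for illustration; the document does not specify them):

```python
# Illustrative rate-control step keyed to transmit-buffer utilization:
# a filling buffer suggests the network cannot keep up, so the encoding
# rate is lowered; a draining buffer suggests headroom to raise it.

def adjust_bitrate(current_kbps: int, buffer_fill: float) -> int:
    """buffer_fill is the utilization of outgoing buffer 105, in [0, 1]."""
    MIN_KBPS, MAX_KBPS, STEP = 300, 4000, 100
    if buffer_fill > 0.75:      # buffer filling: reduce encoding rate
        return max(MIN_KBPS, current_kbps - STEP)
    if buffer_fill < 0.25:      # buffer draining: raise encoding rate
        return min(MAX_KBPS, current_kbps + STEP)
    return current_kbps         # utilization in a comfortable band
```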


In various embodiments, encoder system 102 may be implemented using any of the various SLINGBOX products available from Sling Media of Foster City, Calif., although other products could be used in other embodiments. Many different types of encoder systems 102 are generally capable of receiving media content 122 from an external source 106 such as any sort of digital video recorder (DVR), set top box (STB), cable or satellite programming source, DVD player, and/or the like. In such embodiments, encoder system 102 may additionally provide commands 124 to the source 106 to produce desired signals 122. Such commands 124 may be provided over any sort of wired or wireless interface, such as an infrared or other wireless transmitter that emulates remote control commands receivable by the source 106. Other embodiments, however, particularly those that do not involve placeshifting, may modify or omit this feature entirely.


In other embodiments, encoder system 102 may be integrated with any sort of content receiving or other capabilities typically affiliated with source 106. Encoder system 102 may be a hybrid STB or other receiver, for example, that also provides transcoding and placeshifting features. Such a device may receive satellite, cable, broadcast and/or other signals that encode television programming or other content received from an antenna, modem, server and/or other source. The receiver may further demodulate or otherwise decode the received signals to extract programming that can be locally viewed and/or place shifted to a remote player 104 as appropriate. Such devices 102 may also include a content database stored on a hard disk drive, memory, or other storage medium to support a personal or digital video recorder (DVR) feature or other content library as appropriate. Hence, in some embodiments, source 106 and encoder system 102 may be physically and/or logically contained within a common component, housing or chassis.


In still other embodiments, encoder system 102 is a software program, applet or the like executing on a conventional computing system (e.g., a personal computer). In such embodiments, encoder system 102 may encode, for example, some or all of a screen display typically provided to a user of the computing system for placeshifting to a remote location. One device capable of providing such functionality is the SlingProjector product available from Sling Media of Foster City, Calif., which executes on a conventional personal computer, although other products could be used as well.


Media player 104 is any device, component, module, hardware, software and/or the like capable of receiving a media stream 120 from one or more encoder systems 102. In various embodiments, remote player 104 is a personal computer (e.g., a “laptop” or similarly portable computer, although desktop-type computers could also be used), a mobile phone, a personal digital assistant, a personal media player (such as the ARCHOS products available from the Archos company of Igny, France) or the like. In many embodiments, remote player 104 is a general purpose computing device that includes a media player application in software or firmware that is capable of securely connecting to placeshifting encoder system 102, as described more fully below, and of receiving and presenting media content to the user of the device as appropriate. In other embodiments, however, media player 104 is a standalone or other separate hardware device capable of receiving the media stream 120 via any portion of network 110 and decoding the media stream 120 to provide an output signal 126 that is presented on a television or other display 108. One example of a standalone media receiver 104 is the SLINGCATCHER product available from Sling Media of Foster City, Calif., although other products could be equivalently used.


Network 110 is any digital or other communications network capable of transmitting messages between senders (e.g., encoder system 102) and receivers (e.g., receiver 104). In various embodiments, network 110 includes any number of public or private data connections, links or networks supporting any number of communications protocols. Network 110 may include the Internet, for example, or any other network based upon TCP/IP or other conventional protocols. In various embodiments, network 110 also incorporates a wireless and/or wired telephone network, such as a cellular communications network for communicating with mobile phones, personal digital assistants, and/or the like. Network 110 may also incorporate any sort of wireless or wired local area networks, such as one or more IEEE 802.3 and/or IEEE 802.11 networks.


Encoder system 102 and player 104 are therefore able to communicate with each other in any manner (e.g., using any sort of data connections 128 and/or 125, respectively). Such communication may take place over a wide area link that includes the Internet and/or a telephone network, for example; in other embodiments, communications between devices 102 and 104 may take place over one or more wired or wireless local area links that are conceptually incorporated within network 110. In various equivalent embodiments, encoder system 102 and receiver 104 may be directly connected via any sort of cable (e.g., an Ethernet cable or the like) with little or no other network functionality provided.


Many different placeshifting scenarios could be formulated based upon available computing and communications resources, consumer demand and/or any other factors. In various embodiments, consumers may wish to placeshift content within a home, office or other structure, such as from a placeshifting encoder system 102 to a desktop or portable computer located in another room. In such embodiments, the content stream will typically be provided over a wired or wireless local area network operating within the structure. In other embodiments, consumers may wish to placeshift content over a broadband or similar network connection from a primary location to a computer or other remote player 104 located in a second home, office, hotel or other remote location. In still other embodiments, consumers may wish to placeshift content to a mobile phone, personal digital assistant, media player, video game player, automotive or other vehicle media player, and/or other device via a mobile link (e.g., a GSM/EDGE or CDMA/EVDO connection, any sort of 3G or subsequent telephone link, an IEEE 802.11 “Wi-fi” link, and/or the like). Several examples of placeshifting applications available for various platforms are provided by Sling Media of Foster City, Calif., although the concepts described herein could be used in conjunction with products and services available from any source.


Encoder system 102, then, generally creates a media stream 120 that is routable on network 110 based upon content 122 received from media source 106. To that end, and with reference now to FIG. 2, encoder system 102 typically includes an encoder module 202, a buffer 105 and a network interface 206 in conjunction with appropriate control logic 205. In operation, encoder module 202 typically receives media content 122 from an internal or external source 106, encodes the data into the desired format for media stream 120, and stores the encoded data in buffer 105. Network interface 206 then retrieves the formatted data from buffer 105 for transmission on network 110. Control module 205 suitably monitors and controls the encoding and network transmit processes carried out by encoding module 202 and network interface 206, respectively, and may perform other features as well. Encoder system 102 may also have a module 208 or other feature capable of generating and providing commands 124 to an external media source 106, as described above.
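

Conceptually, the interaction among encoder module 202, buffer 105, and network interface 206 might be sketched as two loops sharing a queue (a simplification; the encode operation and socket are hypothetical stand-ins):

```python
# Simplified sketch of the encode -> buffer -> transmit pipeline of FIG. 2.
# The queue stands in for buffer 105; encode() and the socket are stand-ins.

import queue

encoded_buffer: "queue.Queue[bytes]" = queue.Queue(maxsize=256)   # buffer 105

def encode(chunk: bytes) -> bytes:
    return chunk                    # stand-in for real encoding/transcoding

def encoder_loop(source):
    """Encoder module 202: receive content 122, encode it, store in buffer."""
    for raw_chunk in source:
        encoded_buffer.put(encode(raw_chunk))

def network_loop(sock):
    """Network interface 206: retrieve formatted data and transmit it."""
    while True:
        sock.send(encoded_buffer.get())
```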


In the exemplary embodiment shown in FIG. 2, modules 202, 105, 205, 206 and 208 may be implemented in software or firmware residing in any memory, mass storage or other storage medium within encoder system 102 in source code, object code and/or any other format. Such features may be executed on any sort of processor or microcontroller executing within encoder system 102. In various embodiments, encoder system 102 is implemented as a system on a chip (SoC) type system with integrated processing, storage and input/output features. Various SoC hardware implementations are available from Texas Instruments, Conexant Systems, Broadcom Inc., and other suppliers as appropriate. Other embodiments may use any number of discrete and/or integrated processing components, memories, input/output features and/or other features as desired.


As noted above, creating a media stream 120 typically involves encoding and/or transcoding an input media stream 122 received from an internal or external media source 106 into a suitable digital format that can be transmitted on network 110. Generally, the media stream 120 is placed into a standard or other known format (e.g., the WINDOWS MEDIA format available from the Microsoft Corporation of Redmond, Wash., although other formats such as the QUICKTIME format, REALPLAYER format, MPEG format, and/or the like could be used in other embodiments) that can be transmitted on network 110. This encoding may take place, for example, in any sort of encoding module 202 as appropriate. Encoding module 202 may be any sort of hardware (e.g., a digital signal processor or other integrated circuit used for media encoding), software (e.g., software or firmware programming used for media encoding that executes on the SoC or other processor described above), or the like. Encoding module 202 is therefore any feature that receives media data 122 from the internal or external source 106 (e.g., via any sort of hardware and/or software interface) and encodes or transcodes the received data into the desired format for transmission on network 110. Although FIG. 2 shows a single encoding module 202, in practice system 102 may include any number of encoding modules 202. Different encoding modules 202 may be selected based upon the preferences of player 104, network conditions, and/or the like.


In various embodiments, encoder 202 may also apply other modifications, transforms and/or filters to the received content before or during the transcoding process. Video signals, for example, may be resized, cropped and/or skewed. Similarly, the color, hue and/or saturation of the signal may be altered, and/or noise reduction or other filtering may be applied. Audio signals may be modified by adjusting volume, sampling rate, mono/stereo parameters, noise reduction, multi-channel sound parameters and/or the like. Digital rights management encoding and/or decoding may also be applied in some embodiments, and/or other features may be applied as desired.


In various embodiments, including those in which media is streamed to a media player 104 that comprises a general purpose computer running a media application, a set of scene transition frames is generated and transmitted to media player 104 (e.g., by placeshifting device 102, out-of-band) such that a user may easily search for a particular scene within the media stream.


As a preliminary matter, FIG. 3 shows a simplified view of a typical user interface used to seek within a time shift buffer. In general, a timeline 304 is presented to the user such that its geometry corresponds to the extent of the media being viewed, while the buffer will typically include a subset of the total media that will ultimately be streamed. In this embodiment, for example, timeline 304 is literally a line segment whose endpoints correspond to the beginning and end of the media. To visualize the current location within the stream, an icon 306 (e.g., a rectangular icon as shown) is displayed along timeline 304 at the location that proportionally corresponds to the position of the viewed media within window 302. By moving icon 306 back and forth along timeline 304 (to the extent allowed by the buffered information), the user is able to select a particular scene or time within the buffer and begin watching the displayed media 302 at that point. As will be appreciated, it can be difficult to find a particular scene or scene transition (e.g., the point at which a program resumes after a commercial) using a simple linear scrubber interface as illustrated.
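

The mapping behind such a scrubber reduces to a few lines; the following is an illustrative sketch (the coordinate and duration parameters are hypothetical):

```python
# Convert the position of icon 306 along timeline 304 into a seek time,
# clamped to the portion of the media that is actually buffered.

def scrub_to_time(icon_x: float, timeline_x0: float, timeline_x1: float,
                  media_duration: float, buffered_end: float) -> float:
    fraction = (icon_x - timeline_x0) / (timeline_x1 - timeline_x0)
    target = fraction * media_duration
    return min(max(target, 0.0), buffered_end)  # only buffered media is seekable
```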


In accordance with the present invention, however, a set of scene transition frames is generated and transmitted along with the media stream. These frames can then be used in connection with a user interface (e.g., as a real-time “storyboard”) to allow the viewer to easily select a desired scene within the media stream.


Referring now to FIG. 4, a media stream 402 can be visualized conceptually as a series of time-wise sequential frames 410 (e.g., 410a-410h), which may have a variety of formats as is known in the art. In this illustration, the various geometric forms within frames 410 are used to depict exemplary content of the frame images as the scenes progress. In general, a subset of frames 410 will be keyframes placed at regular intervals.


In accordance with one aspect, a set of scene transition frames 420 is generated by performing substantially real-time analysis of media stream 402. That is, a typical media stream—particularly narrative video works such as TV programs, movies, and the like—will include content made up of a series of scenes. These scenes are generally characterized by more or less contiguous movement, dialog, viewpoint, and/or physical settings.


The system (e.g., placeshifting device 102) suitably examines stream 402 to determine when such scene transitions occur. In the simplified stream 402 shown in FIG. 4, for example, there is a transition from frame 410a to 410b (from a star shape to a triangle shape), and a transition from frame 410e to 410f (from a triangle shape to an ellipse). Stated another way, scene transition frame 420a comprises the first frame after a transition time 405, and scene transition frame 420b comprises the first frame after a transition time 407.


An individual will intuitively regard frames 410b-410e as a single “scene,” notwithstanding the fact that the frames are not identical, because their content is sufficiently contiguous over time (i.e., a triangle translating diagonally within the frame). Similarly, frames 410f-410h are sufficiently contiguous in terms of the rotation and translation of an ellipse over time.


The system (e.g., through any suitable combination of hardware and software) likewise attempts to determine scene transitions 405 and 407 via one or more image processing algorithms. That is, the system generates the set of scene transition frames 420 by examining adjacent frames 410 for an image content change that is greater than a predetermined threshold or other metric.


Various attributes of the image content of frames 410 may be analyzed and compared, including, for example, color, shape, brightness, contrast, hue, saturation, detected edges, and any other attribute that can assist in determining scene transitions.
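

As one concrete (and purely illustrative) realization of such threshold-based detection, adjacent frames could be compared by the distance between their intensity histograms; the metric and threshold below are example choices, not the specific algorithm claimed:

```python
# Illustrative scene-transition detection: flag a frame as beginning a new
# scene when its image content differs from the previous frame by more than
# a predetermined threshold. Histogram distance is one example metric;
# color, edges, or other attributes could be used instead or in addition.

import numpy as np

def histogram(frame: np.ndarray, bins: int = 64) -> np.ndarray:
    h, _ = np.histogram(frame, bins=bins, range=(0, 255))
    return h / max(h.sum(), 1)           # normalized for fair comparison

def detect_scene_transitions(frames: list[np.ndarray],
                             threshold: float = 0.3) -> list[int]:
    """Return indices of frames that are the first frame of a new scene."""
    transitions = []
    prev = histogram(frames[0])
    for i in range(1, len(frames)):
        cur = histogram(frames[i])
        # L1 distance between normalized histograms lies in [0, 2].
        if np.abs(cur - prev).sum() > threshold:
            transitions.append(i)        # first frame after the transition
        prev = cur
    return transitions
```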


In one embodiment, generation of scene transition frames 420 is performed by placeshifting device 102, and those frames are subsequently sent to media player 104 over network 110 (FIG. 1), where they are stored in a suitable memory or storage medium. Scene transition frames 420 may be sent within the signal used for the media stream itself, but are preferably sent “out-of-band” with respect to the media stream. In this way, media player 104 may consider the two streams of information in parallel. In one embodiment, however, all key frames are cached. This allows, for example, placeshifting device 102 to mark particular key frames (e.g., in-band) as “scene change” frames using conventional techniques, instead of sending a large number of frames out-of-band.
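

The two delivery options can be contrasted with a small sketch (the record layouts and the matching tolerance are invented for illustration):

```python
# Two illustrative delivery options for scene transition information:
# out-of-band thumbnail messages, or in-band "scene change" marks applied
# to key frames the player is already caching.

from dataclasses import dataclass

@dataclass
class KeyFrameRecord:
    timestamp: float
    is_scene_change: bool = False    # set when marked in-band

@dataclass
class OutOfBandThumbnail:
    timestamp: float
    image: bytes                     # small thumbnail sent on a side channel

def mark_scene_changes(keyframes: list[KeyFrameRecord],
                       transition_times: list[float],
                       tolerance: float = 0.5) -> None:
    """In-band variant: flag the cached key frame nearest each detected
    transition rather than transmitting extra frames out-of-band."""
    for t in transition_times:
        nearest = min(keyframes, key=lambda k: abs(k.timestamp - t))
        if abs(nearest.timestamp - t) <= tolerance:
            nearest.is_scene_change = True
```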


In the interest of providing a responsive and timely user interface, the scene transition frames 420 may be sent substantially contemporaneously with the media stream. This allows the user to select frames in near real time as the media accumulates within media player 104.


In this regard, referring now to FIGS. 5 and 6, the received scene transition frames 420 may be presented to the user in a number of different ways, depending upon the desired user interface 500. In FIG. 5, for example, a subset of the scene transition frames 420 available to the player is depicted as a one-dimensional sequence of thumbnails. Suitable navigation icons (not shown) may be provided for navigating through the entire set. The number and size of thumbnails used for scene transition frames 420 may be selected depending upon screen size, processing power, and any other relevant characteristics of the system.


In a second embodiment, shown in FIG. 6, a two-dimensional array of thumbnails is presented to the user. As with FIG. 5, various additional user interface elements may be provided for “zooming out” or otherwise allowing a more macro view of the available scene transition frames 420.
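

The sizing consideration noted above reduces to a simple layout computation; the following sketch covers both arrangements (the thumbnail dimensions and margin are arbitrary sample values):

```python
# Example computation of how many scene-transition thumbnails fit the
# display, for either the one-dimensional strip of FIG. 5 (rows == 1)
# or the two-dimensional array of FIG. 6.

def thumbnail_grid(screen_w: int, screen_h: int,
                   thumb_w: int = 160, thumb_h: int = 90,
                   margin: int = 8) -> tuple[int, int]:
    """Return (columns, rows) of thumbnails that fit on screen."""
    cols = max(1, screen_w // (thumb_w + margin))
    rows = max(1, screen_h // (thumb_h + margin))
    return cols, rows
```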


Regardless of how the subset of frames 420 is displayed, the user is allowed to select a desired scene transition frame 420 using any convenient mode of user interaction, including conventional keyboard and mouse selection techniques. Once the desired scene transition frame 420 is selected, the media player 104 then jumps to that frame (and point of time) within the media stream, and continues playing or otherwise displaying the media stream for the user.
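

A selection handler tying these pieces together might look like this (the player object and its methods are hypothetical):

```python
# Illustrative handler for choosing a scene transition thumbnail: the
# player seeks to the corresponding point in the stream and resumes play.

def on_thumbnail_selected(player, scene_timestamps: list[float],
                          selected_index: int) -> None:
    target = scene_timestamps[selected_index]   # time of the chosen scene
    player.seek(target)    # jump to the frame at the scene transition
    player.play()          # continue displaying the media from that point
```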


It will be apparent that the various methods and systems described above are advantageous in that, among other things, a user can easily find and select a particular scene by directly selecting it from a storyboard, rather than by roughly traversing a linear user interface and finding the scene through trial and error.


The term “exemplary” is used herein to represent one example, instance or illustration that may have any number of alternates. Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations. While several exemplary embodiments have been presented in the foregoing detailed description, it should be appreciated that a vast number of alternate but equivalent variations exist, and the examples presented herein are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of elements described without departing from the scope of the claims and their legal equivalents.

Claims
  • 1. A method of providing a media stream from a placeshifting device to a remotely-located media player via a network, the method comprising: receiving a media stream of live content by the placeshifting device in a first format; while the placeshifting device is continuously receiving the media stream of the live content, the placeshifting device transcoding the media stream of the live content from the first format to a second format having at least one parameter suited to then-current network conditions between the placeshifting device and the media player; while transcoding the media stream of the live content, the placeshifting device performing a substantially real-time analysis of the media stream to generate a plurality of scene transition frames associated with the media stream; and while continuing to receive and transcode the media stream of the live content, the placeshifting device transmitting the plurality of scene transition frames generated by the real-time analysis and the transcoded media stream in the second format substantially contemporaneously over the network to the remotely-located media player to thereby allow the remotely-located media player to store and display the plurality of scene transition frames for viewing and subsequent selection and navigation of the media stream by a user of the remotely-located media player; wherein the generation of the plurality of scene transition frames, the storing of the plurality of scene transition frames, the display of the plurality of scene transition frames and the selection of the desired scene transition frames are performed substantially contemporaneously with viewing of the media stream by the user; and wherein the media stream further includes a plurality of key frames, and the plurality of scene transition frames correspond to a marked subset of the key frames, wherein a number of marked scene transition frames is less than a number of key frames included in the media stream.
  • 2. The method of claim 1, wherein the plurality of scene transition frames are received substantially contemporaneously with the media stream.
  • 3. The method of claim 1 wherein the plurality of scene transition frames are displayed as a one-dimensional series of thumbnail images.
  • 4. The method of claim 1, wherein the plurality of scene transition frames are displayed as a two-dimensional array of thumbnail images.
  • 5. The method of claim 1, wherein the plurality of scene transition frames are generated by the placeshifting device examining adjacent frames for image content change that is greater than a predetermined threshold during transcoding of the media stream.
  • 6. A placeshifting device to transmit a media stream of live content to a remotely-located media player via a network, the placeshifting device comprising a network interface to the network, a memory storing instructions, and a processor configured to execute the instructions, wherein the instructions, when executed, cause the placeshifting device to perform operations comprising: receiving the media stream of the live content in a first format via the network interface; while continuously receiving the media stream of the live content, transcoding the media stream of the live content from the first format to a second format having at least one parameter suited to then-current conditions of the network between the placeshifting device and the media player; while transcoding the media stream of the live content, performing a substantially real-time analysis of the media stream to generate a plurality of scene transition frames associated with the media stream; and while continuing to receive and transcode the media stream of the live content, transmitting the plurality of scene transition frames generated by the real-time analysis and the transcoded media stream in the second format substantially contemporaneously over the network to the remotely-located media player to thereby allow the remotely-located media player to store and display the plurality of scene transition frames for viewing and subsequent selection and navigation of the media stream by a user of the remotely-located media player; wherein the generation of the plurality of scene transition frames, the storing of the plurality of scene transition frames, the display of the plurality of scene transition frames and the selection of the desired scene transition frames are performed substantially contemporaneously with viewing of the media stream by the user; and wherein the media stream further includes a plurality of key frames, and the plurality of scene transition frames correspond to a marked subset of the key frames, wherein a number of marked scene transition frames is less than a number of key frames included in the media stream.
  • 7. The placeshifting device of claim 6 further comprising a buffer coupled to the network interface, wherein the then-current network conditions are indicated by a utilization of the buffer.
  • 8. The placeshifting device of claim 7 wherein the at least one parameter is a bit rate of the media stream.
  • 9. The placeshifting device of claim 7 wherein the media stream is encoded from the first format to the second format as the media stream is received by the placeshifting device.
  • 10. The placeshifting device of claim 9 wherein the media stream is a live television broadcast.
US Referenced Citations (353)
Number Name Date Kind
3416043 Jorgensen Dec 1968 A
4254303 Takizawa Mar 1981 A
5161021 Tsai Nov 1992 A
5206929 Langford et al. Apr 1993 A
5237648 Mills et al. Aug 1993 A
5386493 Degen et al. Jan 1995 A
5434590 Dinwiddie, Jr. et al. Jul 1995 A
5493638 Hooper et al. Feb 1996 A
5590262 Isadore-Barreca Dec 1996 A
5602589 Vishwanath et al. Feb 1997 A
5635982 Zhang Jun 1997 A
5661516 Carles Aug 1997 A
5666426 Helms Sep 1997 A
5682195 Hendricks et al. Oct 1997 A
5706290 Shaw et al. Jan 1998 A
5708767 Yeo Jan 1998 A
5708961 Hylton et al. Jan 1998 A
5710605 Nelson Jan 1998 A
5717879 Moran et al. Feb 1998 A
5722041 Freadman Feb 1998 A
5757416 Birch et al. May 1998 A
5774170 Hite et al. Jun 1998 A
5778077 Davidson Jul 1998 A
5794116 Matsuda et al. Aug 1998 A
5821945 Yeo Oct 1998 A
5822537 Katseff et al. Oct 1998 A
5831664 Wharton et al. Nov 1998 A
5850482 Meany et al. Dec 1998 A
5852437 Wugofski et al. Dec 1998 A
5880721 Yen Mar 1999 A
5884056 Steele Mar 1999 A
5898679 Brederveld et al. Apr 1999 A
5909518 Chui Jun 1999 A
5911582 Redford et al. Jun 1999 A
5922072 Hutchinson et al. Jul 1999 A
5936968 Lyons Aug 1999 A
5968132 Tokunaga Oct 1999 A
5987501 Hamilton et al. Nov 1999 A
6002450 Darbee et al. Dec 1999 A
6006265 Rangan et al. Dec 1999 A
6008777 Yiu Dec 1999 A
6014694 Aharoni et al. Jan 2000 A
6020880 Naimpally Feb 2000 A
6031940 Chui et al. Feb 2000 A
6036601 Heckel Mar 2000 A
6040829 Croy et al. Mar 2000 A
6043837 Driscoll, Jr. et al. Mar 2000 A
6049671 Slivka et al. Apr 2000 A
6075906 Fenwick et al. Jun 2000 A
6088777 Sorber Jul 2000 A
6097441 Allport Aug 2000 A
6104334 Allport Aug 2000 A
6108041 Faroudja et al. Aug 2000 A
6115420 Wang Sep 2000 A
6117126 Appelbaum et al. Sep 2000 A
6141059 Boyce et al. Oct 2000 A
6141447 Linzer et al. Oct 2000 A
6144375 Jain et al. Nov 2000 A
6160544 Hayashi et al. Dec 2000 A
6201536 Hendricks et al. Mar 2001 B1
6212282 Mershon Apr 2001 B1
6219837 Yeo et al. Apr 2001 B1
6222885 Chaddha et al. Apr 2001 B1
6223211 Hamilton et al. Apr 2001 B1
6240459 Roberts et al. May 2001 B1
6240531 Spilo et al. May 2001 B1
6243596 Kikinis Jun 2001 B1
6256019 Allport Jul 2001 B1
6263503 Margulis Jul 2001 B1
6278446 Liou et al. Aug 2001 B1
6279029 Sampat et al. Aug 2001 B1
6282714 Ghori et al. Aug 2001 B1
6286142 Ehreth Sep 2001 B1
6310886 Barton Oct 2001 B1
6340994 Margulis et al. Jan 2002 B1
6353885 Herzi et al. Mar 2002 B1
6356945 Shaw et al. Mar 2002 B1
6357021 Kitagawa et al. Mar 2002 B1
6370688 Hejna, Jr. Apr 2002 B1
6389467 Eyal May 2002 B1
6434113 Gubbi Aug 2002 B1
6442067 Chawla et al. Aug 2002 B1
6456340 Margulis Sep 2002 B1
6459459 Ratakonda Oct 2002 B1
6466623 Youn et al. Oct 2002 B1
6466732 Kimura et al. Oct 2002 B1
6470378 Tracton et al. Oct 2002 B1
6476826 Plotkin et al. Nov 2002 B1
6487319 Chai Nov 2002 B1
6493874 Humpleman Dec 2002 B2
6496122 Sampsell Dec 2002 B2
6505169 Bhagavath et al. Jan 2003 B1
6510177 De Bonet et al. Jan 2003 B1
6514207 Ebadollahi Feb 2003 B2
6529506 Yamamoto et al. Mar 2003 B1
6532043 Kurtze et al. Mar 2003 B1
6553147 Chai et al. Apr 2003 B2
6557031 Mimura et al. Apr 2003 B1
6560281 Black May 2003 B1
6564004 Kadono May 2003 B1
6567984 Allport May 2003 B1
6580437 Liou et al. Jun 2003 B1
6584201 Konstantinou et al. Jun 2003 B1
6584559 Huh et al. Jun 2003 B1
6597375 Yawitz Jul 2003 B1
6598159 McAlister et al. Jul 2003 B1
6600838 Chui Jul 2003 B2
6609253 Swix et al. Aug 2003 B1
6611530 Apostolopoulos Aug 2003 B1
6628713 Kojima et al. Sep 2003 B1
6628716 Tan et al. Sep 2003 B1
6642939 Vallone et al. Nov 2003 B1
6647015 Malkemes et al. Nov 2003 B2
6658019 Chen et al. Dec 2003 B1
6665751 Chen et al. Dec 2003 B1
6665813 Forsman et al. Dec 2003 B1
6678635 Tovinkere et al. Jan 2004 B2
6697356 Kretschmer et al. Feb 2004 B1
6701380 Schneider et al. Mar 2004 B2
6704678 Minke et al. Mar 2004 B2
6704847 Six et al. Mar 2004 B1
6708231 Kitagawa Mar 2004 B1
6718551 Swix et al. Apr 2004 B1
6721361 Covell Apr 2004 B1
6738100 Hampapur May 2004 B2
6754266 Bahl et al. Jun 2004 B2
6754439 Hensley et al. Jun 2004 B1
6757851 Park et al. Jun 2004 B1
6757906 Look et al. Jun 2004 B1
6766376 Price Jul 2004 B2
6768775 Wen et al. Jul 2004 B1
6771828 Malvar Aug 2004 B1
6774912 Ahmed et al. Aug 2004 B1
6781601 Cheung Aug 2004 B2
6785700 Masud et al. Aug 2004 B2
6795638 Skelley, Jr. Sep 2004 B1
6798838 Ngo Sep 2004 B1
6806909 Radha et al. Oct 2004 B1
6807306 Girgensohn Oct 2004 B1
6807308 Chui et al. Oct 2004 B2
6816194 Zhang et al. Nov 2004 B2
6816858 Coden et al. Nov 2004 B1
6826242 Ojard et al. Nov 2004 B2
6834123 Acharya et al. Dec 2004 B2
6839079 Barlow et al. Jan 2005 B2
6847468 Ferriere Jan 2005 B2
6850571 Tardif Feb 2005 B2
6850649 Malvar Feb 2005 B1
6868083 Apostolopoulos et al. Mar 2005 B2
6889385 Rakib et al. May 2005 B1
6892359 Nason et al. May 2005 B1
6898583 Rising, III May 2005 B1
6907602 Tsai et al. Jun 2005 B2
6927685 Wathen Aug 2005 B2
6930661 Uchida et al. Aug 2005 B2
6941575 Allen Sep 2005 B2
6944880 Allen Sep 2005 B1
6952595 Ikedo et al. Oct 2005 B2
6965723 Abe et al. Nov 2005 B1
6981050 Tobias et al. Dec 2005 B1
6985623 Prakash Jan 2006 B2
7016337 Wu et al. Mar 2006 B1
7020892 Levesque et al. Mar 2006 B2
7032000 Tripp Apr 2006 B2
7047305 Brooks et al. May 2006 B1
7089496 Hanes Aug 2006 B2
7110558 Elliott Sep 2006 B1
7124366 Foreman et al. Oct 2006 B2
7151575 Landry et al. Dec 2006 B1
7155734 Shimomura et al. Dec 2006 B1
7155735 Ngo et al. Dec 2006 B1
7184100 Wilf Feb 2007 B1
7184433 Oz Feb 2007 B1
7224323 Uchida et al. May 2007 B2
7239800 Bilbrey Jul 2007 B2
7313183 Bazin et al. Dec 2007 B2
7344084 Dacosta Mar 2008 B2
7355606 Paquette Apr 2008 B2
7406249 Shirakawa Jul 2008 B2
7430686 Wang et al. Sep 2008 B1
7436886 Hannuksela Oct 2008 B2
7464396 Hejna, Jr. Dec 2008 B2
7502733 Andrsen et al. Mar 2009 B2
7505480 Zhang et al. Mar 2009 B1
7565681 Ngo et al. Jul 2009 B2
7634793 Hunleth et al. Dec 2009 B2
7657836 Pan Feb 2010 B2
7725912 Margulis May 2010 B2
7738505 Chang Jun 2010 B2
7751683 Belknap Jul 2010 B1
7804503 Fernandez et al. Sep 2010 B2
7889794 Luo Feb 2011 B2
7941031 Tanikawa et al. May 2011 B2
8032840 Haro et al. Oct 2011 B2
8046688 Adams et al. Oct 2011 B2
8050321 Hannuksela Nov 2011 B2
8099755 Bajpai et al. Jan 2012 B2
8237720 Li et al. Aug 2012 B2
8237864 Chung Aug 2012 B2
8363960 Petersohn Jan 2013 B2
8614705 Lefevre et al. Dec 2013 B2
8629918 Takagi Jan 2014 B2
8639089 Kusunoki et al. Jan 2014 B2
8843952 Pora et al. Sep 2014 B2
20010021998 Margulis Sep 2001 A1
20020004839 Wine et al. Jan 2002 A1
20020010925 Kikinis Jan 2002 A1
20020012526 Sai et al. Jan 2002 A1
20020012530 Bruls Jan 2002 A1
20020031333 Mano et al. Mar 2002 A1
20020046404 Mizutani Apr 2002 A1
20020053053 Nagai et al. May 2002 A1
20020080753 Lee Jun 2002 A1
20020090029 Kim Jul 2002 A1
20020105529 Bowser et al. Aug 2002 A1
20020112247 Horner et al. Aug 2002 A1
20020122137 Chen et al. Sep 2002 A1
20020131497 Jang Sep 2002 A1
20020138843 Samaan et al. Sep 2002 A1
20020143973 Price Oct 2002 A1
20020147634 Jacoby et al. Oct 2002 A1
20020147687 Breiter et al. Oct 2002 A1
20020167458 Baudisch et al. Nov 2002 A1
20020188818 Nimura et al. Dec 2002 A1
20020191575 Kalavade et al. Dec 2002 A1
20030001880 Holtz et al. Jan 2003 A1
20030014752 Zaslavsky Jan 2003 A1
20030028873 Lemmons Feb 2003 A1
20030065915 Yu et al. Apr 2003 A1
20030093260 Dagtas et al. May 2003 A1
20030095791 Barton et al. May 2003 A1
20030115167 Sharif et al. Jun 2003 A1
20030142751 Hannuksela Jul 2003 A1
20030156552 Banker et al. Aug 2003 A1
20030159143 Chan Aug 2003 A1
20030187657 Erhart et al. Oct 2003 A1
20030191776 Obrador Oct 2003 A1
20030192054 Birks et al. Oct 2003 A1
20030208612 Harris et al. Nov 2003 A1
20030231621 Gubbi et al. Dec 2003 A1
20040003406 Billmaier Jan 2004 A1
20040052216 Roh Mar 2004 A1
20040068334 Tsai et al. Apr 2004 A1
20040071157 Feldman et al. Apr 2004 A1
20040083301 Murase et al. Apr 2004 A1
20040100486 Flamini et al. May 2004 A1
20040103340 Sundareson et al. May 2004 A1
20040139047 Rechsteiner et al. Jul 2004 A1
20040139462 Hannuksela Jul 2004 A1
20040162845 Kim et al. Aug 2004 A1
20040162903 Oh Aug 2004 A1
20040172410 Shimojima et al. Sep 2004 A1
20040181545 Deng Sep 2004 A1
20040205830 Kaneko Oct 2004 A1
20040212640 Mann et al. Oct 2004 A1
20040216173 Horoszowski et al. Oct 2004 A1
20040236844 Kocherlakota Nov 2004 A1
20040255249 Chang et al. Dec 2004 A1
20050021398 McCleskey et al. Jan 2005 A1
20050027821 Alexander et al. Feb 2005 A1
20050038981 Connor et al. Feb 2005 A1
20050044058 Matthews et al. Feb 2005 A1
20050050462 Whittle et al. Mar 2005 A1
20050053356 Mate et al. Mar 2005 A1
20050055595 Frazer et al. Mar 2005 A1
20050060759 Rowe et al. Mar 2005 A1
20050097542 Lee May 2005 A1
20050114852 Chen et al. May 2005 A1
20050123058 Greenbaum et al. Jun 2005 A1
20050132351 Randall et al. Jun 2005 A1
20050138560 Lee et al. Jun 2005 A1
20050169371 Lee Aug 2005 A1
20050191041 Braun et al. Sep 2005 A1
20050198584 Matthews et al. Sep 2005 A1
20050204046 Watanabe Sep 2005 A1
20050216851 Hull et al. Sep 2005 A1
20050227621 Katoh Oct 2005 A1
20050229118 Chiu et al. Oct 2005 A1
20050246369 Oreizy et al. Nov 2005 A1
20050251833 Schedivy Nov 2005 A1
20050283791 McCarthy et al. Dec 2005 A1
20050283798 Hunleth et al. Dec 2005 A1
20050288999 Lerner et al. Dec 2005 A1
20060011371 Fahey Jan 2006 A1
20060031381 Van Luijt et al. Feb 2006 A1
20060050970 Gunatilake Mar 2006 A1
20060051055 Ohkawa Mar 2006 A1
20060078305 Arora et al. Apr 2006 A1
20060095401 Krikorian et al. May 2006 A1
20060095471 Krikorian et al. May 2006 A1
20060095472 Krikorian et al. May 2006 A1
20060095942 Van Beek May 2006 A1
20060095943 Demircin et al. May 2006 A1
20060104266 Pelletier May 2006 A1
20060107226 Matthews et al. May 2006 A1
20060117371 Margulis Jun 2006 A1
20060143650 Tanikawa Jun 2006 A1
20060146174 Hagino Jul 2006 A1
20060174021 Osborne Aug 2006 A1
20060280157 Karaoguz et al. Dec 2006 A1
20070003224 Krikorian et al. Jan 2007 A1
20070005783 Saint-Hillaire et al. Jan 2007 A1
20070022328 Tarra et al. Jan 2007 A1
20070074115 Patten et al. Mar 2007 A1
20070074117 Tian Mar 2007 A1
20070076604 Litwack Apr 2007 A1
20070168543 Krikorian et al. Jul 2007 A1
20070180485 Dua Aug 2007 A1
20070198532 Krikorian et al. Aug 2007 A1
20070201746 Kim Aug 2007 A1
20070234213 Krikorian et al. Oct 2007 A1
20070286596 Lonn Dec 2007 A1
20080019276 Takatsuji et al. Jan 2008 A1
20080037573 Cohen Feb 2008 A1
20080059533 Krikorian Mar 2008 A1
20080095228 Hannuksela et al. Apr 2008 A1
20080134267 Moghe et al. Jun 2008 A1
20080195744 Bowra et al. Aug 2008 A1
20080199150 Candelore Aug 2008 A1
20080232687 Petersohn Sep 2008 A1
20080263621 Austerlitz et al. Oct 2008 A1
20080294759 Biswas et al. Nov 2008 A1
20080307456 Beetcher et al. Dec 2008 A1
20080307462 Beetcher et al. Dec 2008 A1
20080307463 Beetcher et al. Dec 2008 A1
20090041117 Hannuksela Feb 2009 A1
20090074380 Boston et al. Mar 2009 A1
20090079840 Gandhi et al. Mar 2009 A1
20090080864 Rajakarunanayake Mar 2009 A1
20090097500 Diab et al. Apr 2009 A1
20090102983 Malone et al. Apr 2009 A1
20090103607 Bajpai et al. Apr 2009 A1
20090109341 Oguz et al. Apr 2009 A1
20090115845 Walls May 2009 A1
20090150406 Giblin Jun 2009 A1
20090199248 Ngo et al. Aug 2009 A1
20090207316 Cupal Aug 2009 A1
20100050080 Libert et al. Feb 2010 A1
20100050083 Axen et al. Feb 2010 A1
20100070483 Delgo et al. Mar 2010 A1
20100070925 Einaudi et al. Mar 2010 A1
20100086022 Hunleth et al. Apr 2010 A1
20100100915 Krikorian et al. Apr 2010 A1
20100166063 Perlman et al. Jul 2010 A1
20100303439 Doser et al. Dec 2010 A1
20110013882 Kusunoki et al. Jan 2011 A1
20110093560 Morris Apr 2011 A1
20110182561 Bae Jul 2011 A1
20110221927 Takagi Sep 2011 A1
20140002749 Pora et al. Jan 2014 A1
20140007152 Pora et al. Jan 2014 A1
20140267337 Keohane et al. Sep 2014 A1
20140282690 Keohane et al. Sep 2014 A1
Foreign Referenced Citations (28)
Number Date Country
1464685 Dec 2003 CN
4407319 Sep 1994 DE
0838945 Apr 1998 EP
1077407 Feb 2001 EP
1443766 Aug 2004 EP
1691550 Aug 2006 EP
1830558 Sep 2007 EP
2307151 May 1997 GB
2003046582 Feb 2003 JP
2003114845 Apr 2003 JP
2004015111 Jan 2004 JP
19990082855 Nov 1999 KR
20010211410 Aug 2001 KR
0133839 May 2001 WO
0147248 Jun 2001 WO
0193161 Dec 2001 WO
03026232 Mar 2003 WO
03052552 Jun 2003 WO
03098897 Nov 2003 WO
2004032511 Apr 2004 WO
2005050898 Jun 2005 WO
2006064454 Jun 2006 WO
2006074110 Jul 2006 WO
2007027891 Mar 2007 WO
2007051156 May 2007 WO
2007141555 Dec 2007 WO
2007149466 Dec 2007 WO
2008024723 Feb 2008 WO
Non-Patent Literature Citations (143)
Entry
International Search Report and Written Opinion for International Application No. PCT/US2006/025911, mailed Jan. 3, 2007.
International Search Report for International Application No. PCT/US2007/063599, mailed Dec. 12, 2007.
International Search Report for International Application No. PCT/US2007/076337, mailed Oct. 20, 2008.
International Search Report and Written Opinion for International Application No. PCT/US2006/025912, mailed Jul. 17, 2008.
International Search Report for International Application No. PCT/US2008/059613, mailed Jul. 21, 2008.
International Search Report and Written Opinion for International Application No. PCT/US2008/080910, mailed Feb. 16, 2009.
Wikipedia “Slingbox” [Online], Oct. 21, 2007, XP002512399; retrieved from the Internet: <URL:http://en.wikipedia.org/w/index.php?title=Slingbox&oldid=166080570>; retrieved on Jan. 28, 2009.
Wikipedia “LocationFree Player” [Online], Sep. 22, 2007, XP002512400; retrieved from the Internet: <URL: http://en.wikipedia.org/w/index.php?title=LocationFree—Player&oldid=159683564>; retrieved on Jan. 28, 2009.
Capable Networks LLC “Keyspan Remote Control—Controlling Your Computer With a Remote” [Online], Feb. 21, 2006, XP002512495; retrieved from the Internet: <URL:http://www.slingcommunity.com/article/11791/Keyspan-Remote-Control---Controlling-Your-Computer-With-a-Remote/?highlight=remote+control>; retrieved on Jan. 28, 2009.
Sling Media Inc. “Slingbox User Guide” [Online] 2006, XP002512553; retrieved from the Internet: <URL:http://www.slingmedia.hk/attach/en-US—Slingbox—User—Guide—v.12.pdf>; retrieved on Jan. 29, 2009.
Sony Corporation “LocationFree TV” [Online], 2004, SP002512410; retrieved from the Internet: <URL:http://www.docs.sony.com/release/LFX1—X5revision.pdf>, retrieved on Jan. 28, 2009 [note—document uploaded in two parts as file exceeds the 25MB size limit].
Sony Corporation “LocationFree Player Pak—LocationFree Base Station—LocationFree Player” [Online] 2005, XP002512401; retrieved from the Internet: <URL:http://www.docs.sony.com/release/LFPK1.pdf>; retrieved on Jan. 28, 2009.
Krikorian, Jason, U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
Tarra, Raghuveer et al., U.S. Appl. No. 60/975,239, filed Sep. 26, 2007.
Williams, George Edward, U.S. Appl. No. 12/167,041, filed Jul. 2, 2008.
Rao, Padmanabha R., U.S. Appl. No. 12/166,039, filed Jul. 1, 2008.
International Search Report and Written Opinion, PCT/US2005/020105, Feb. 15, 2007, 6 pages.
International Search Report and Written Opinion for PCT/US2006/04382, mailed Apr. 27, 2007.
Archive of “TV Brick Home Server,” www.tvbrick.com, [online] [Archived by http://archive.org on Jun. 3, 2004; Retrieved on Apr. 12, 2006] retrieved from the Internet <URL:http://web.archive.org/web/20041107111024/www.tvbrick.com/en/affilliate/tvbs/tvbrick/document18/print>.
Faucon, B. “TV ‘Brick’ Opens up Copyright Can of Worms,” Financial Review, Jul. 1, 2003, [online [Retrieved on Apr. 12, 2006] Retrieved from the Internet, URL:http://afr.com/cgi-bin/newtextversions.pl?storyid+1056825330084&3ate+2003/07/01&pagetype+printer&section+1053801318705&path+articles/2003/06/30/0156825330084.html].
Balster, Eric J., “Video Compression and Rate Control Methods Based on the Wavelet Transform,” The Ohio State University 2004, pp. 1-24.
Kulapala et al., “Comparison of Traffic and Quality Characteristics of Rate-Controlled Wavelet and DCT Video,” Arizona State University, Oct. 11, 2004.
Skodras et al., “JPEG2000: The Upcoming Still Image Compression Standard,” May 11, 2000, 14 pages.
Taubman et al., “Embedded Block Coding in JPEG2000,” Feb. 23, 2001, pp. 1-8 of 36.
Kessler, Gary C., An Overview of TCP/IP Protocols and the Internet; Jan. 16, 2007, retrieved from the Internet on Jun. 12, 2008 at http://www.garykessler.net/library/tcpip.html; originally submitted to the InterNIC and posted on their Gopher site on Aug. 5, 1994.
Roe, Kevin, “Third-Party Observation Under EPC Article 115 on the Patentability of an Invention,” Dec. 21, 2007.
Roe, Kevin, Third-Party Submission for Published Application Under CFR §1.99, Mar. 26, 2008.
Bajpai, Parimal et al. “Systems and Methods of Controlling the Encoding of a Media Stream,” U.S. Appl. No. 12/339,878, filed Dec. 19, 2008.
Malone, Edward D. et al. “Systems and Methods for Controlling Media Devices,” U.S. Appl. No. 12/256,344, filed Oct. 22, 2008.
Banger, Shashidhar et al. “Systems and Methods for Determining Attributes of Media Items Accessed via a Personal Media Broadcaster,” U.S. Appl. No. 12/334,959, filed Dec. 15, 2008.
Kulkarni, Anant Madhava “Systems and Methods for Creating Logical Media Streams for Media Storage and Playback,” U.S. Appl. No. 12/323,907, filed Nov. 26, 2008.
Rao, Padmanabha R. “Systems and Methods for Linking Media Content,” U.S. Appl. No. 12/359,784, filed Jan. 26, 2009.
Krikorian, Blake Gary et al. “Systems and Methods for Presenting Media Content Obtained From Multiple Sources,” U.S. Appl. No. 12/408,456, filed Mar. 20, 2009.
Krikorian, Blake Gary et al. “Systems and Methods for Projecting Images From a Computer System,” U.S. Appl. No. 12/408,460, filed Mar. 20, 2009.
China State Intellectual Property Office “First Office Action,” issued Jul. 31, 2009, for Application No. 200580026825.X.
USPTO, Non-Final Office Action, mailed Aug. 4, 2009; U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
USPTO, Final Office Action, mailed Jul. 31, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action, mailed Aug. 5, 2009; U.S. Appl. No. 11/147,663, filed Jun. 7, 2005.
USPTO, Non-Final Office Action, mailed Sep. 3, 2009; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
Einaudi, Andrew E. et al. “Systems and Methods for Selecting Media Content Obtained from Multiple Sources,” U.S. Appl. No. 12/543,278, filed Aug. 18, 2009.
Malode, Deepak Ravi “Remote Control and Method for Automatically Adjusting the Volume Output of an Audio Device,” U.S. Appl. No. 12/550,145, filed Aug. 28, 2009.
Akella, Aparna Sarma “Systems and Methods for Event Programming via a Remote Media Player,” U.S. Appl. No. 12/537,057, filed Aug. 6, 2009.
Shah, Bhupendra Natwerlan et al. “Systems and Methods for Transcoding and Place Shifting Media Content,” U.S. Appl. No. 12/548,130, filed Aug. 26, 2009.
Banger, Shashidhar et al. “Systems and Methods for Automatically Controlling the Resolution of Streaming Video Content,” U.S. Appl. No. 12/537,785, filed Aug. 7, 2009.
Panigrahi, Biswaranjan “Home Media Aggregator System and Method,” U.S. Appl. No. 12/538,681, filed Aug. 10, 2009.
Nandury, Venkata Kishore “Adaptive Gain Control for Digital Audio Samples in a Media Stream,” U.S. Appl. No. 12/507,971, filed Jul. 23, 2009.
Shirali, Amey “Systems and Methods for Providing Programming Content,” U.S. Appl. No. 12/538,676, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan “Systems and Methods for Virtual Remote Control of Streamed Media,” U.S. Appl. No. 12/538,664, filed Aug. 10, 2009.
Thiyagarajan, Venkatesan et al. “Localization Systems and Method,” U.S. Appl. No. 12/538,783, filed Aug. 10, 2009.
Lucas, Brian et al. “Systems and Methods for Establishing Connections Between Devices Communicating Over a Network,” U.S. Appl. No. 12/426,103, filed Apr. 17, 2009.
Thiyagarajan, Venkatesan “Systems and Methods for Updating Firmware Over a Network,” U.S. Appl. No. 12/538,661, filed Aug. 10, 2009.
Iyer, Satish “Methods and Apparatus for Fast Seeking Within a Media Stream Buffer,” U.S. Appl. No. 12/538,659, filed Aug. 10, 2009.
European Patent Office, International Searching Authority, “International Search Report,” for International Application No. PCT/US2009/049006, mailed Sep. 11, 2009.
Conway, Frank et al. “Systems and Methods for Creating Variable Length Clips from a Media Stream,” U.S. Appl. No. 12/347,465, filed Dec. 31, 2008.
USPTO, Final Office Action, mailed Nov. 6, 2009; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Nov. 12, 2009; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Nov. 23, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action mailed Oct. 1, 2009; U.S. Appl. No. 11/778,287, filed Jul. 16, 2007.
USPTO Final Office Action mailed Dec. 30, 2009; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
European Patent Office, European Search Report, mailed Sep. 28, 2009 for European Application No. EP 06 78 6175.
International Search Report for PCT/US2008/069914 mailed Dec. 19, 2008.
PCT Partial International Search, PCT/US2009/054893, mailed Dec. 23, 2009.
Newton's Telecom Dictionary, 21st ed., Mar. 2005.
Ditze M. et all “Resource Adaptation for Audio-Visual Devices in the UPnP QoS Architecture,” Advanced Networking and Applications, 2006; AINA, 2006; 20% H International conference on Vienna, Austria Apr. 18-20, 2006.
Joonbok, Lee et al. “Compressed High Definition Television (HDTV) Over IPv6,” Applications and the Internet Workshops, 2006; Saint Workshops, 2006; International Symposium, Phoenix, AZ, USA, Jan. 23-27, 2006.
Lowekamp, B. et al. “A Hierarchy of Network Performance Characteristics for Grid Applications and Services,” GGF Network Measurements Working Group, pp. 1-29, May 24, 2004.
Meyer, Derrick “MyReplayTV™ Creates First-Ever Online Portal to Personal TI! Service; Gives Viewers Whole New Way to Interact With Programming,” http://web.archive.org/web/20000815052751/http://www.myreplaytv.com/, Aug. 15, 2000.
Sling Media “Sling Media Unveils Top-of-Line Slingbox PRO-HD” [online], Jan. 4, 2008, XP002560049; retrieved from the Internet: URL:www.slingmedia.com/get/pr-slingbox-pro-hd.html; retrieved on Oct. 12, 2009.
Srisuresh, P. et al. “Traditional IP Network Address Translator (Traditional NAT),” Network Working Group, The Internet Society, Jan. 2001.
Asnis, Ilya et al. “Mediated Network address Translation Traversal” U.S. Appl. No. 12/405,039, filed Mar. 16, 2009.
Thiyagarajan, Venkatesan et al. “Always-On-Top Media Player Launched From a Web Browser,” U.S. Appl. No. 12/617,271, filed Nov. 12, 2009.
Paul, John Michael et al. “Systems and Methods for Delivering Messages Over a Network,” U.S. Appl. No. 12/619,192, filed Nov. 16, 2009.
Rao, Padmanabha R. et al. “Methods and Apparatus for Establishing Network Connections Using an Inter-Mediating Device,” U.S. Appl. No. 12/642,368, filed Dec. 18, 2009.
Dham, Vikram et al. “Systems and Methods for Establishing Network Connections Using Local Mediation Services,” U.S. Appl. No. 12/644,918, filed Dec. 22, 2009.
Paul, John et al. “Systems and Methods for Remotely Controlling Media Server via a Network,” U.S. Appl. No. 12/645,870, filed Dec. 23, 2009.
Bajpal, Parimal et al. “Method and Node for Transmitting Data Over a Communication Network using Negative Ackhowledgement,” U.S. Appl. No. 12/404,920, filed Mar. 16, 2009.
Bajpal, Parimal et al. “Method and Note for Employing Network connections Over a Connectinoless Transport Layer Protocol,” U.S. Appl. No. 12/405,062, filed Mar. 16, 2009.
China State Intellectual Property Office “Office Action” issued Mar. 18, 2010 for Application No. 200680022520.6.
China State Intellectual Property Office “Office Action” issued Apr. 13, 2010 for Application No. 200580026825.X.
Canadian Intellectual Property Office “Office Action” mailed Feb. 18, 2010 for Application No. 2569610.
European Patent Office “European Search Report,” mailed May 7, 2010 for Application No. 06786174.0.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,193, filed Apr. 12, 2010.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,194, filed Apr. 12, 2010.
Margulis, Neal “Apparatus and Method for Effectively Implementing a Wireless Television System,” U.S. Appl. No. 12/758,196, filed Apr. 12, 2010.
Kirkorian, Jason Gary et al. “Personal Media Broadcasting System with Output Buffer,” U.S. Appl. No. 12/757,697, filed Apr. 9, 2010.
Tarra, Raghuveer et al. “Firmware Update for Consumer Electronic Device,” U.S. Appl. No. 12/757,714, filed Apr. 9, 2010.
European Patent Office, European Search Report for European Application No. EP 08 16 7880, mailed Mar. 4, 2009.
MythTV Wiki, "MythTV User Manual" [Online], Aug. 27, 2007, XP002515046; retrieved from the Internet: <URL: http://www.mythtv.org/wiki?title=User_Manual:Introduction&oldid=25549>.
International Searching Authority, Written Opinion and International Search Report for International Application No. PCT/US2008/077733, mailed Mar. 18, 2009.
International Searching Authority, Written Opinion and International Search Report for International Application No. PCT/US2008/087005, mailed Mar. 20, 2009.
Watanabe Y. et al., “Multimedia Database System for TV Newscasts and Newspapers”; Lecture Notes in Computer Science, Springer Verlag, Berlin, Germany; vol. 1554, Nov. 1, 1998, pp. 208-220, XP002402824, ISSN: 0302-9743.
Yasuhiko Watanabe et al., "Aligning Articles in TV Newscasts and Newspapers"; Proceedings of the International Conference on Computational Linguistics, XX, XX, Jan. 1, 1998, pp. 1381-1387, XP002402825.
Sodergard C. et al., “Integrated Multimedia Publishing: Combining TV and Newspaper Content on Personal Channels”; Computer Networks, Elsevier Science Publishers B.V., Amsterdam, Netherlands; vol. 31, No. 11-16, May 17, 1999, pp. 1111-1128, XP004304543, ISSN: 1389-1286.
Ariki Y. et al., “Automatic Classification of TV News Articles Based on Telop Character Recognition”; Multimedia Computing and Systems, 1999; IEEE International Conference on Florence, Italy, Jun. 7-11, 1999, Los Alamitos, California, USA, IEEE Comput. Soc. US; vol. 2, Jun. 7, 1999, pp. 148-152, XP010519373, ISBN: 978-0-7695-0253-3; abstract, paragraph [03.1], paragraph [052], figures 1,2.
USPTO, Non-Final Office Action mailed Dec. 17, 2004; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Jul. 28, 2005; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed Jan. 30, 2006; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Aug. 10, 2006; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed Jun. 19, 2007; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed Apr. 16, 2008; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Final Office Action mailed Sep. 18, 2008; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed Mar. 31, 2009; U.S. Appl. No. 09/809,868, filed Mar. 15, 2001.
USPTO, Non-Final Office Action mailed May 1, 2008; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Final Office Action mailed Dec. 29, 2008; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Non-Final Office Action mailed Jun. 8, 2009; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Non-Final Office Action mailed Jun. 26, 2008; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Final Office Action mailed Oct. 21, 2008; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Mar. 25, 2009; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Aug. 7, 2008; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO, Final Office Action mailed Feb. 9, 2009; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Feb. 25, 2009; U.S. Appl. No. 11/683,862, filed Mar. 8, 2007.
USPTO, Non-Final Office Action mailed Dec. 24, 2008; U.S. Appl. No. 11/147,985, filed Jun. 7, 2005.
USPTO, Non-Final Office Action mailed Jun. 25, 2008; U.S. Appl. No. 11/428,254, filed Jun. 30, 2006.
USPTO, Final Office Action mailed Feb. 6, 2009; U.S. Appl. No. 11/428,254, filed Jun. 30, 2006.
USPTO, Non-Final Office Action mailed May 15, 2009; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
Sonic Blue “ReplayTV 5000 User's Guide,” 2002, entire document.
Bluetooth-News; Main Future User Models Document Verification & Qualification: Bluetooth Technical Background, Apr. 21, 1999; pp. 1-2 of 7; http://www.bluetooth.com/v2/news/show.asp.
Microsoft Corporation; Harman/Kardon “Master Your Universe” 1999.
Matsushita Electric Corporation of America, MicroCast: Wireless PC Multimedia Transceiver System, Nov. 1998.
“Wireless Local Area Networks: Issues in Technology and Standards” Jan. 6, 1999.
USPTO, Final Office Action mailed Jun. 25, 2009; U.S. Appl. No. 11/147,985, filed Jun. 7, 2005.
China State Intellectual Property Office “First Office Action,” issued Jan. 8, 2010, for Application No. 200810126554.0.
USPTO, Final Office Action mailed Jan. 25, 2010; U.S. Appl. No. 11/734,277, filed Apr. 12, 2007.
Australian Government “Office Action,” Australian Patent Application No. 2006240518, mailed Nov. 12, 2009.
Jain, Vikal Kumar "Systems and Methods for Coordinating Data Communication Between Two Devices," U.S. Appl. No. 12/699,280, filed Feb. 3, 2010.
Gangotri, Arun L. et al. "Systems and Methods and Program Applications for Selectively Restricting the Placeshifting of Copy Protected Digital Media Content," U.S. Appl. No. 12/623,955, filed Nov. 23, 2009.
Paul, John et al. “Systems and Methods for Searching Media Content,” U.S. Appl. No. 12/648,024, filed Dec. 28, 2009.
Newton's Telecom Dictionary, 20th ed., Mar. 2004.
“The Authoritative Dictionary of IEEE Standard Terms,” 7th ed. 2000.
Gurzhi, Alexander et al. "Systems and Methods for Emulating Network-Enabled Media Components," U.S. Appl. No. 12/711,830, filed Feb. 24, 2010.
USPTO, Final Office Action mailed Mar. 3, 2010; U.S. Appl. No. 11/111,265, filed Apr. 21, 2005.
USPTO, Final Office Action mailed Mar. 12, 2010; U.S. Appl. No. 11/620,711, filed Jan. 7, 2007.
Lee, M. et al. "Video Frame Rate Control for Non-Guaranteed Network Services with Explicit Rate Feedback," Globecom '00, 2000 IEEE Global Telecommunications Conference, San Francisco, CA, Nov. 27-Dec. 1, 2000; [IEEE Global Telecommunications Conference], New York, NY; IEEE, US, vol. 1, Nov. 27, 2000, pp. 293-297, XP001195580; ISBN: 978-0-7803-6452-3; lines 15-20 of sec. II on p. 293, fig. 1.
European Patent Office, International Searching Authority, “International Search Report and Written Opinion,” mailed Jun. 4, 2010 for International Application No. PCT/IN2009/000728, filed Dec. 18, 2009.
USPTO, Non-Final Office Action mailed Jun. 23, 2010; U.S. Appl. No. 11/933,969, filed Nov. 1, 2007.
Korean Intellectual Property Office “Official Notice of Preliminary Rejection,” issued Jun. 18, 2010; Korean Patent Application No. 10-2008-7021254.
Japan Patent Office “Notice of Grounds for Rejection (Office Action),” mailed May 25, 2010; Patent Application No. 2007-0268269.
Japan Patent Office “Notice of Grounds for Rejection (Office Action),” mailed May 25, 2010; Patent Application No. 2007-527683.
European Patent Office, International Searching Authority, “International Search Report,” mailed Mar. 30, 2010; International Application PCT/US2009/068468 filed Dec. 27, 2009.
USPTO, Non-Final Office Action mailed Mar. 19, 2010; U.S. Appl. No. 11/147,664, filed Jun. 7, 2005.
USPTO, Non-Final Office Action mailed Mar. 31, 2010; U.S. Appl. No. 11/620,707, filed Jan. 7, 2007.
USPTO, Non-Final Office Action mailed Apr. 1, 2010; U.S. Appl. No. 12/237,103, filed Sep. 24, 2008.
Qiong, Liu et al. “Digital Rights Management for Content Distribution,” Proceedings of the Australasian Information Security Workshop Conference on ACSW Frontiers 2003, vol. 21, 2003, XP002571073, Adelaide, Australia, ISSN: 1445-1336, ISBN: 1-920682-00-7, sections 2 and 2.1.1.
Related Publications (1)
  - Number
    20110035669 A1
  - Date
    Feb 2011
  - Country
    US