The present disclosure generally relates to techniques for controlling the encoding of a bitstream, such as a media stream, that is transmitted over a data connection.
Recently, consumers have expressed significant interest in “place shifting” devices that allow viewing of television or other media content at locations other than their primary television set. Place shifting devices typically packetize media content that can be transmitted over a local or wide area network to a portable computer, mobile phone, personal digital assistant, remote television or other remote device capable of playing back the packetized media stream for the viewer. Placeshifting therefore allows consumers to view their media content from remote locations such as other rooms, hotels, offices, and/or any other locations where portable media player devices can gain access to a wireless or other communications network.
While placeshifting does greatly improve the convenience afforded to the viewer, challenges can arise in effectively creating and transmitting the encoded media stream. The variety of network environments (e.g., LAN, WAN, Internet, wireless telephone, etc.) that may be supported can lead to significant variations in encoding parameters over time. Moreover, digital networks, particularly those based on Ethernet and/or TCP/IP-type protocols, are inherently unpredictable, which can lead to difficulties in selecting particular encoding parameters to be used in creating and transmitting the media stream over any particular network. Moreover, network conditions can change very rapidly, thereby leading to difficulties in maintaining encoding parameters that are both efficient and current.
It is therefore desirable to create systems and methods for controlling the encoding of a media stream that is transmitted over a network or other data connection. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.
According to various exemplary embodiments, systems and methods are described for providing a media stream transmitted from an encoding system to a remotely-located media player. In an exemplary method, the media stream is encoded according to an encoding parameter. Data is gathered about a transmit buffer within the encoding system, and the gathered data is processed to determine an estimate of network capacity and a calculated encoder rate. The encoding parameter is adjusted during subsequent encoding as a function of at least one of the estimate of network capacity and the calculated encoder rate.
In other embodiments, a media encoding system is provided for providing a media stream to a remote player over a network. The media encoding system comprises a media encoding module configured to receive an input signal and to encode the media stream according to an encoding parameter; a network interface configured to transmit the encoded media stream to the remote player over the network; a transmit buffer configured to be filled by the media encoding module and emptied by the network interface; and a control module. The control module is configured to gather data about the transmit buffer, to process the gathered data to determine an estimate of a network capacity and a calculated encoder rate, and to adjust the encoding parameter during subsequent encoding as a function of at least one of the estimate of network capacity and the calculated encoder rate.
Various embodiments, aspects and other features are described in more detail below.
Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
The following detailed description of the invention is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
According to various embodiments, the efficiency and effectiveness of media stream encoding can be greatly improved by considering the particular values of the current encoder rate and the capacity of a transmit buffer located between the encoder and the network, as described more fully below. The buffer's occupancy behavior can be a relatively good indicator of actual network capacity and behavior. As a result, by considering the actual rate at which the media stream is being created with respect to the actual rate that the stream is being transmitted on a network, system behavior can be significantly improved in comparison to conventional techniques.
In particular, at least two issues can arise during the encoding and transmission process. First, if the data encoder has a range of operation extending above and below the capacity of the data connection, system throughput can suffer whenever the encoder provides data at a faster or slower rate than the network transmission rate. If the encoder operates at a faster rate than the network transmission rate, then the buffer between the encoder and the network will fill until it reaches its capacity. Conversely, if the encoder operates slower than the network transmission rate, then the capability of the network becomes underutilized.
Secondly, issues can occur in assigning processing resources between the encoder module and the network processing module within the device that can lead to sub-optimal performance. For example, if the encoder consumes an unduly large amount of processor time to produce the media stream, the remaining processor time may be insufficient for the network module to transmit the data, thereby resulting in loss of data and/or a degraded user experience. Similarly, if the encoder generates data at a less-than-optimal rate, the quality of the encoded signal can also be less than optimal, thereby reducing the demands placed upon the network transmission module. In this case, the processor is under-utilized, thereby again leading to a degraded user experience. Hence, it is desirable in many implementations to balance the processor allocation between the encoder and the network modules in a manner that allows both the encoder and network processing modules to operate at a capacity that provides the best possible user experience.
By addressing the actual rates by which the transmit buffer is filled and emptied, much more precise control over the encoding and transmitting processes can be achieved, resulting in better resource allocation and/or more efficient resource utilization. Various embodiments may address either or both of these issues, and/or may provide other features as desired.
Turning now to the drawing figures and with initial reference to
Placeshifting encoder system 102 is any component, hardware, software logic and/or the like capable of transmitting a packetized stream of media content over network 110. In various embodiments, placeshifting device 102 incorporates suitable encoder and/or transcoder (collectively “encoder”) logic to convert audio/video or other media data 122 into a packetized format that can be transmitted over network 110. The media data 122 may be received in any format, and may be received from any internal or external source 106 such as any sort of broadcast, cable or satellite television programming source, a “video-on-demand” or similar source, a digital video disk (DVD) or other removable media, a video camera, and/or the like. Encoder system 102 encodes media data 122 to create media stream 120 in any manner. In various embodiments, encoder system 102 contains a transmit buffer 105 that temporarily stores encoded data prior to transmission on network 110. As buffer 105 fills or empties, one or more parameters of the encoding (e.g., the bit rate of media stream 120) may be adjusted to maintain desirable picture quality and data throughput in view of the then-current network performance. As described more fully below, various embodiments are able to calculate a current encoding rate and a current network transfer rate, and are able to adjust the encoding rate as the network transfer rate changes. Changes in the network transfer rate may be identified from, for example, changes in the utilization of the outgoing buffer 105.
Encoder system 102 may be implemented using any of the various SLINGBOX products available from Sling Media of Foster City, Calif., for example, although other products could be used in other embodiments. Many different types of encoder systems 102 are generally capable of receiving media content 122 from an external source 106 such as any sort of digital video recorder (DVR), set top box (STB), cable or satellite programming source, DVD player, and/or the like. In such embodiments, encoder system 102 may additionally provide commands 124 to the source 106 to produce desired signals 122. Such commands 124 may be provided over any sort of wired or wireless interface, such as an infrared or other wireless transmitter that emulates remote control commands receivable by the source 106. Other embodiments, however, particularly those that do not involve placeshifting, may modify or omit this feature entirely.
In other embodiments, encoder system 102 may be integrated with any sort of content receiving or other capabilities typically affiliated with source 106. Encoder system 102 may be a hybrid STB or other receiver, for example, that also provides transcoding and placeshifting features. Such a device may receive satellite, cable, broadcast and/or other signals that encode television programming or other content received from an antenna, modem, server and/or other source. The receiver may further demodulate or otherwise decode the received signals to extract programming that can be locally viewed and/or place shifted to a remote player 104 as appropriate. Such devices 102 may also include a content database stored on a hard disk drive, memory, or other storage medium to support a personal or digital video recorder (DVR) feature or other content library as appropriate. Hence, in some embodiments, source 106 and encoder system 102 may be physically and/or logically contained within a common component, housing or chassis.
In still other embodiments, encoder system 102 is a software program, applet or the like executing on a conventional computing system (e.g., a personal computer). In such embodiments, encoder system 102 may encode, for example, some or all of a screen display typically provided to a user of the computing system for placeshifting to a remote location. One device capable of providing such functionality is the SlingProjector product available from Sling Media of Foster City, Calif., which executes on a conventional personal computer, although other products could be used as well.
Media player 104 is any device, component, module, hardware, software and/or the like capable of receiving a media stream 120 from one or more encoder systems 102. In various embodiments, remote player 104 is a personal computer (e.g., a “laptop” or similarly portable computer, although desktop-type computers could also be used), a mobile phone, a personal digital assistant, a personal media player (such as the ARCHOS products available from the Archos company of Igny, France) or the like. In many embodiments, remote player 104 is a general purpose computing device that includes a media player application in software or firmware that is capable of securely connecting to placeshifting encoder system 102, as described more fully below, and of receiving and presenting media content to the user of the device as appropriate. In other embodiments, however, media player 104 is a standalone or other separate hardware device capable of receiving the media stream 120 via any portion of network 110 and decoding the media stream 120 to provide an output signal 126 that is presented on a television or other display 108. One example of a standalone media receiver 104 is the SLINGCATCHER product available from Sling Media of Foster City, Calif., although other products could be equivalently used.
Network 110 is any digital or other communications network capable of transmitting messages between senders (e.g., encoder system 102) and receivers (e.g., receiver 104). In various embodiments, network 110 includes any number of public or private data connections, links or networks supporting any number of communications protocols. Network 110 may include the Internet, for example, or any other network based upon TCP/IP or other conventional protocols. In various embodiments, network 110 also incorporates a wireless and/or wired telephone network, such as a cellular communications network for communicating with mobile phones, personal digital assistants, and/or the like. Network 110 may also incorporate any sort of wireless or wired local area networks, such as one or more IEEE 802.3 and/or IEEE 802.11 networks.
Encoder system 102 and player 104 are therefore able to communicate with each other in any manner (e.g., using any sort of data connections 128 and/or 125, respectively). Such communication may take place over a wide area link that includes the Internet and/or a telephone network, for example; in other embodiments, communications between devices 102 and 104 may take place over one or more wired or wireless local area links that are conceptually incorporated within network 110. In various equivalent embodiments, encoder system 102 and receiver 104 may be directly connected via any sort of cable (e.g., an Ethernet cable or the like) with little or no other network functionality provided.
Many different placeshifting scenarios could be formulated based upon available computing and communications resources, consumer demand and/or any other factors. In various embodiments, consumers may wish to placeshift content within a home, office or other structure, such as from a placeshifting encoder system 102 to a desktop or portable computer located in another room. In such embodiments, the content stream will typically be provided over a wired or wireless local area network operating within the structure. In other embodiments, consumers may wish to placeshift content over a broadband or similar network connection from a primary location to a computer or other remote player 104 located in a second home, office, hotel or other remote location. In still other embodiments, consumers may wish to placeshift content to a mobile phone, personal digital assistant, media player, video game player, automotive or other vehicle media player, and/or other device via a mobile link (e.g., a GSM/EDGE or CDMA/EVDO connection, any sort of 3G or subsequent telephone link, an IEEE 802.11 “Wi-fi” link, and/or the like). Several examples of placeshifting applications available for various platforms are provided by Sling Media of Foster City, Calif., although the concepts described herein could be used in conjunction with products and services available from any source.
Encoder system 102, then, generally creates a media stream 120 that is routable on network 110 based upon content 122 received from media source 106. To that end, and with reference now to
In the exemplary embodiment shown in
As noted above, creating a media stream 120 typically involves encoding and/or transcoding an input media stream 122 received from an internal or external media source 106 into a suitable digital format that can be transmitted on network 110. Generally, the media stream 120 is placed into a standard or other known format (e.g., the WINDOWS MEDIA format available from the Microsoft Corporation of Redmond, Wash., although other formats such as the QUICKTIME format, REALPLAYER format, MPEG format, and/or the like could be used in other embodiments) that can be transmitted on network 110. This encoding may take place, for example, in any sort of encoding module 202 as appropriate. Encoding module 202 may be any sort of hardware (e.g., a digital signal processor or other integrated circuit used for media encoding), software (e.g., software or firmware programming used for media encoding that executes on the SoC or other processor described above), or the like. Encoding module 202 is therefore any feature that receives media data 122 from the internal or external source 106 (e.g., via any sort of hardware and/or software interface) and encodes or transcodes the received data into the desired format for transmission on network 110. Although
In various embodiments, encoder 202 may also apply other modifications, transforms and/or filters to the received content before or during the transcoding process. Video signals, for example, may be resized, cropped and/or skewed. Similarly, the color, hue and/or saturation of the signal may be altered, and/or noise reduction or other filtering may be applied. Audio signals may be modified by adjusting volume, sampling rate, mono/stereo parameters, noise reduction, multi-channel sound parameters and/or the like. Digital rights management encoding and/or decoding may also be applied in some embodiments, and/or other features may be applied as desired.
As noted above, one or more parameters of the encoding process (e.g., the bit rate, frame rate, image resolution and/or other parameters) may be adjusted during the encoding process to produce a media stream 120 that is modified or tuned to the then-current capabilities of network 110. The encoding bit rate, for example, can be adjusted in response to a measured capability of network 110. That is, bit rate may be increased when network conditions are able to accommodate the larger bandwidth consumption associated with the higher rate; conversely, bit rate can be decreased when network conditions are less able to accommodate the greater demands. By adjusting the bit rate of the encoding process in response to the network performance, the user experience can be greatly improved.
Network interface 206 refers to any hardware, software and/or firmware that allows encoding system 102 to communicate on network 110. In various embodiments, network interface 206 includes suitable network stack programming and other features and/or conventional network interface card (NIC) hardware such as any wired or wireless interface as desired.
In various embodiments, control module 205 monitors and controls the encoding and transmit processes performed by encoding module 202 and network interface 206, respectively. To that end, control module 205 is any hardware, software, firmware or combination thereof capable of performing such features. In various embodiments, control module 205 further processes commands received from the remote player via network interface 206 (e.g., by sending commands 124 to the media source 106 via a command module 208 or the like). Control module 205 may also transmit commands to the remote player 104 via network interface 206 and/or may control or otherwise effect any other operations of the encoder system 102. In various embodiments, control module 205 implements the control features used to monitor and adjust the operation of encoder 202 and/or network interface 206 to efficiently provide the media stream to remote player 104.
One technique for monitoring the capability of network 110 involves monitoring the fullness and/or utilization of a buffer 105 in the encoder system 102, as well as the rates at which the buffer 105 fills and empties. Buffer 105 is typically any sort of hardware and/or software feature capable of temporarily storing encoded data prior to transmission on network 110. Typically, buffer 105 is implemented in a portion of memory or mass storage associated with encoder system 102. If buffer 105 remains relatively empty, then the data transmit rate can be deduced to be relatively well-matched to the data encoding rate. That is, data is being transmitted at approximately the same rate that it is being encoded. If the buffer 105 is filling, however, this indicates that the encoder is generating data faster than the network 110 can transmit the data.
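The deduction above can be sketched as a comparison of the rates at which buffer 105 is filled (by the encoder) and emptied (by the network) over one sampling interval. The function name and byte-count inputs below are hypothetical, offered purely as an illustration:

```python
def buffer_trend(bytes_encoded, bytes_transmitted):
    """Compare the buffer fill rate (encoder output) against the empty
    rate (network transmission) over one sampling interval.

    A positive result suggests the encoder is outpacing the network and
    buffer 105 is filling; a negative result suggests network capacity
    is going unused; a result near zero suggests the rates are matched.
    """
    return bytes_encoded - bytes_transmitted
```

A control module could poll this difference each iteration and treat its sign and magnitude as the indicator of actual network behavior described above.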
In a conventional environment, the boundaries between various states 302-310 may represent “watermarks” that indicate transition points wherein the bit rate (or other encoding parameter) is adjusted upwardly and/or downwardly. As the buffer utilization moves from the “yellow” state 306 to the “green” state 304, for example, bit rate may be increased to take advantage of the excess capacity. Similarly, bit rate may be decreased as the buffer utilization breaches a watermark from the top (as shown in
In various embodiments, this broad concept of watermarks can be expanded to even further improve performance. Rather than limiting parameter changes to occurrences of watermark breaches, for example, adjustments may be made within categories (or otherwise) based upon the actual measured/observed values of the encoder rate and buffer capacity. For example, the encoder rate (or other appropriate parameter) may be adjusted based upon a specific amount that is tailored to the actual buffer conditions rather than simply adjusting in accordance with a pre-determined amount that was empirically or otherwise determined at a different time. Further, different actions may be taken at different levels of operation. More aggressive encoding (e.g., increasing encoding bitrate), for example, can be performed at the higher levels (e.g., “bonus” level 302 or “green” level 304) in comparison to the lower levels. Hence, it may be possible and desirable to increase the bitrate (for example) during “green” operation even if a watermark has not been crossed. Similarly, it may be desirable to maintain the bitrate (or other parameter) at a lower level during “red” or “yellow” operation, even following a watermark transition, to let the buffer empty somewhat before making more aggressive parameter changes. Rather than simply adjusting bitrate or other parameters upwardly or downwardly in response to watermark transitions, then, more advanced processing can be used to obtain better performance and/or better asset utilization. Additional details of an exemplary implementation are set forth below.
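One way to sketch this zone-based logic is a simple occupancy-to-zone mapping. The threshold values below are hypothetical placeholders (the description only suggests, for example, that the “red” zone may correspond to roughly 70-85% occupancy); in practice the watermarks would be recalculated from the then-current encoding rate:

```python
# Hypothetical watermark thresholds, expressed as fractions of total
# buffer capacity. In a real system these would be recalculated
# periodically from the current encoding bit rate.
ZONES = [
    (0.10, "bonus"),     # buffer nearly empty: headroom to raise bitrate
    (0.40, "green"),
    (0.70, "yellow"),
    (0.85, "red"),       # roughly 70-85% full per the example above
    (1.01, "critical"),
]

def classify_occupancy(occupancy_fraction):
    """Map a buffer occupancy value (0.0-1.0) to an operating zone."""
    for threshold, zone in ZONES:
        if occupancy_fraction < threshold:
            return zone
    return "critical"
```

Per the discussion above, the zone would then gate how aggressively the parameter may be adjusted, rather than triggering a fixed pre-determined step on each watermark crossing.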
Some embodiments may further retain the use of watermark structures (such as the structure 300 shown in
Turning now to
Generally speaking, watermarks may be adjusted on any regular or irregular basis (step 402). In various embodiments, watermarks are recalculated (step 404) on any regular temporal basis (e.g., every second or so), any irregular logical basis (e.g., every n-th iteration of method 400, such as every tenth iteration or so), or on any other basis as desired. In an exemplary embodiment, method 400 repeats at a frequency of about every 100 ms or so (other embodiments may vary), with watermarks being recalculated on every tenth iteration or so, thereby resulting in recalculation about every second. Other embodiments may use other parameters, and/or may consider other factors as desired. For example, watermarks may be recalculated in some embodiments when it is known that the encoding parameter has been adjusted; conversely, some embodiments may omit recalculation when it is known that the encoding parameter has not been adjusted since the last recalculation. Other embodiments may supplement or modify step 402 in any manner.
Watermarks are calculated according to any suitable parameters, constraints or techniques (step 404). Generally speaking, the watermarks are calculated based upon the then-current value of the adjusted parameter (e.g., bit rate). In various embodiments, an average (or weighted average) of current and former parameter values may be used in determining the new watermarks, as appropriate. The amount of historical data used in creating the average can be determined in any manner, and historical data may be weighted as desired (e.g., so that older data is given less weight).
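A weighted average of the sort described, in which older bit-rate values are progressively de-weighted, might be sketched as follows (the geometric decay factor is a hypothetical choice; any weighting scheme favoring recent data would serve):

```python
def weighted_parameter_average(history, decay=0.5):
    """Weighted average of a bit-rate history (oldest first, newest last).

    Older samples are de-weighted geometrically by `decay` so that the
    most recent parameter values dominate the watermark calculation.
    """
    # Newest sample gets weight decay**0 == 1; each older sample is
    # scaled down by another factor of `decay`.
    weights = [decay ** age for age in range(len(history))][::-1]
    total = sum(w * v for w, v in zip(weights, history))
    return total / sum(weights)
```

The watermark thresholds could then be derived from this average rather than from the instantaneous parameter value alone.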
Data is gathered on any appropriate basis (step 406). In various embodiments, data is gathered on a relatively regular temporal basis (e.g., every 100 ms or so), although other embodiments may gather data on a more irregular or other basis. Data gathered in step 406 is any information that allows for the computation of network capacity, encoder bitrate and/or other factors as appropriate. In various embodiments, data is gathered relating to amount of network traffic (e.g., number of bytes) transferred since the last iteration of step 406, the amount of encoded data (e.g., in bytes) generated since the last iteration of step 406, and/or the current buffer occupancy level (e.g., expressed in bytes or as a percentage of the total buffer capacity). Other factors or parameters may be collected in any number of alternate embodiments.
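The three quantities gathered in step 406 can be represented by a simple record; the names below are illustrative only:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """Data gathered in one ~100 ms iteration of step 406."""
    bytes_transmitted: int   # network traffic since the last iteration
    bytes_encoded: int       # encoder output since the last iteration
    buffer_occupancy: float  # current buffer fill level, 0.0-1.0
```

Each iteration of the control loop would append one such record for use in the averaging described in step 408.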
The gathered data is then processed to arrive at an estimate of network capacity and a calculated encoder rate (step 408). In various embodiments, the values gathered in step 406 may be averaged over any appropriate period of time (e.g., a second or so) to reduce the effects of relatively short term transients that may occur. Network capacity may be calculated based upon the average amount of network traffic transferred over some appropriate period of recent time (e.g., a second or so), for example. The encoder rate may be similarly calculated based upon actual and/or average saved encoder rates over a relevant period of time (e.g., a second or so). Hence, by tracking the rates by which the buffer 105 is filled (e.g., the encoder rate) and emptied (e.g., the network transmit rate), any discrepancies between the two can be readily identified. Similarly, average buffer occupancy over a relatively recent period of time can be used to estimate the current zone of operation. In various embodiments, the zone of operation may be adjusted based upon an average of fewer samples (e.g., representing a shorter period of time, such as about 0.5 sec or so) than the average time window used for the other calculations performed. The averages used to compute the various values may change with each iteration of method 400 so that the averages are effectively “sliding” averages, as appropriate. System processor load may also be considered in any manner. In various embodiments, processor over and/or under utilization can be tracked and compensated in any manner.
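A sliding average over roughly one second of 100 ms samples, as described, might be sketched as follows; the same estimator could be instantiated once for the network transmit rate and once for the encoder rate (class and method names are hypothetical):

```python
from collections import deque

SAMPLE_PERIOD_S = 0.1  # data gathered roughly every 100 ms

class SlidingRate:
    """Sliding-average rate estimator over a window of recent samples.

    With the default window of 10 samples at ~100 ms each, the estimate
    covers about one second, damping short-term transients. Because the
    deque discards the oldest sample as each new one arrives, the
    average "slides" with every iteration.
    """
    def __init__(self, window=10):
        self.samples = deque(maxlen=window)

    def add(self, byte_count):
        """Record the byte count observed in the latest interval."""
        self.samples.append(byte_count)

    def rate_bps(self):
        """Average rate, in bits per second, over the current window."""
        if not self.samples:
            return 0.0
        window_s = len(self.samples) * SAMPLE_PERIOD_S
        return sum(self.samples) * 8 / window_s
```

As noted above, the zone-of-operation estimate might use a shorter window (e.g., five samples for about 0.5 s) than the rate estimates.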
Changes in the encoding parameter (e.g., bit rate) may be made in any manner (steps 410 and 412). In various embodiments, the encoding parameter is evaluated upon each iteration of method 400 (e.g., on the order of every 100 ms or so) according to various criteria. In the exemplary embodiment shown in
If a parameter adjustment is warranted, the particular encoding parameter may be adjusted upwardly or downwardly as desired (step 412). Encoding bit rate, for example, may be increased or decreased to reflect increased or decreased network capability. The particular amount of change will depend upon the particular embodiment and various other factors. For example, bit rate may be increased more aggressively when the buffer 105 is relatively empty, since buffer space is available to compensate for any over-aggressiveness. Conversely, rate increases may be applied much more conservatively when buffer 105 is relatively full. Other factors may be considered as appropriate.
In many embodiments, it may be undesirable to make overly frequent changes to the encoding parameter. Maintaining at least some time interval between parameter changes can allow effective buffer utilization, and can reduce any transient effects of short-lived changes in network capacity and/or processor load, thereby improving the user experience. This delay between changes is reflected in a “time interval” parameter in
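Such an interval guard might be sketched as follows, with the one-second minimum being an illustrative value rather than a fixed requirement (the description elsewhere mentions intervals ranging from roughly 500 ms to several seconds depending on conditions):

```python
def change_allowed(now_s, last_change_s, min_interval_s=1.0):
    """Gate encoding-parameter changes on elapsed time.

    Returns True only if at least min_interval_s seconds have elapsed
    since the last adjustment, damping short-lived transients in
    network capacity and/or processor load.
    """
    return (now_s - last_change_s) >= min_interval_s
```

The minimum interval itself could vary by zone, e.g., longer holds before aggressive increases and shorter holds before urgent decreases.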
In addition to allowing changes in the encoding parameter in response to breaching of a watermark in step 410, various embodiments further adjust the encoding parameters when conditions otherwise warrant (step 416). Adjustments may include any changes to the bit rate, frame rate, image resolution, audio or video quality, or any other parameters or features as desired. For example, the encoding parameter may be adjusted to match the network capacity calculations made in step 408. If the network rate is significantly greater or less than the encoding rate, for example, the encoding rate can be adjusted to match the network capability even if buffer utilization does not indicate a zone transition/watermark breach.
By adjusting the encoding parameters (e.g., the encoding bit rate) in response to the actual fill and empty rates of buffer 105, the performance of encoding system 102 can be substantially improved. By measuring and manipulating the fill and/or empty rates of buffer 105, (e.g., to adapt the fill and empty rates to each other as closely as possible) the user experience can be significantly improved. Moreover, by considering processor utilization in adjusting the encoding parameters, the processor can be more effectively utilized, thereby preventing issues that can result from imbalance.
The particular techniques used to adjust one or more encoding parameters can vary significantly from embodiment to embodiment using the general concepts set forth herein. One detailed implementation is presented below that is intended as one example of the sorts of features that could be implemented in an exemplary encoding system or technique. The various parameters and values used in this example, however, are strictly for purposes of illustration, and are not intended as being exclusive or otherwise limiting.
With reference again to
Parameter transitions may also occur within zones. Within bonus zone 302, for example, the encoding rate may be increased as network performance increases (e.g., to a maximum of 80% or so of the network rate, which may be constrained by the average or peak values observed in recent past network performance as described above). Such adjustments may be limited to appropriate time intervals, such as every two seconds or so as desired. Although 80% or so is an exemplary value for this particular illustration that is not intended to be limiting on all embodiments, that general value has been identified in practice as providing a particularly beneficial result in that it allows a relatively high utilization of available network resources during “bonus” or similar conditions while still providing a comfortable margin for subsequent degradation in network performance. “About 80%” as used herein generally encompasses a range of about 70-90% or so, although 75-85% or even 78-82% could be used in other embodiments, depending upon the particular levels of precision and tolerance desired for the particular application and setting.
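The “about 80%” cap discussed above can be expressed as a simple target calculation; the function name is hypothetical:

```python
def bonus_zone_target(network_rate_bps, cap_fraction=0.80):
    """Target encoder rate while operating in the 'bonus' zone.

    Caps the encoder at about 80% of the measured network rate, which
    utilizes most of the available capacity while leaving a comfortable
    margin for subsequent degradation in network performance.
    """
    return network_rate_bps * cap_fraction
```

As the text notes, the cap fraction could fall anywhere in roughly the 70-90% range depending on the tolerance desired.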
The encoding parameter may also be increased based upon the occupancy of buffer 105, as desired. Within the “bonus” zone 302, for example, the encoder set value may be increased by five percent or so if the buffer empties, and increased more aggressively (e.g., by ten percent or so) if the buffer remains empty for an appropriate period of time (e.g., 500 ms or so). These adjustments may be constrained in time as desired: the smaller adjustment may only occur after the prior rate has been in place for some period of time (e.g., five seconds or so), whereas the more aggressive adjustment may occur after the rate has been steady for a shorter period (e.g., two seconds or so) since the buffer occupancy is observed over a period of time prior to the more aggressive adjustment in this example.
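These empty-buffer increases might be sketched as follows, using the illustrative 5%/10% percentages and 500 ms threshold mentioned above (the separate time-in-place constraints on each adjustment are omitted for brevity):

```python
def bonus_increase(set_rate_bps, empty_duration_ms):
    """Raise the encoder set rate when buffer 105 empties during
    'bonus' operation.

    Per the example above: roughly a 5% increase when the buffer
    empties, and a more aggressive ~10% increase if it has remained
    empty for about 500 ms or more.
    """
    if empty_duration_ms >= 500:
        return set_rate_bps * 1.10
    return set_rate_bps * 1.05
```

In a fuller implementation each branch would also be gated on how long the prior rate had been in place (e.g., five seconds for the smaller step, two seconds for the larger).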
Other upward adjustments of the encoding parameter may be performed in any manner. In an exemplary embodiment, however, for conservative operation the encoding parameter is only adjusted upwardly in response to the system achieving a “bonus” level of performance (e.g., the network rate exceeding the configured and/or actual encoder rates by a significant margin). Upward transitions between non-bonus performance zones, however, may be allowed in other embodiments as desired.
Downward adjustments may also occur in any manner. If the occupancy of buffer 105 during "bonus" operation suddenly increases (e.g., touches "red" zone 308), the encoding parameter may be reduced accordingly.
Other downward transitions could occur at other levels of performance as desired. If the system is currently operating in the "green" zone 304, for example, yet buffer occupancy increases dramatically (e.g., to the "red" zone 308, or about 70-85% full), then the encoding rate may be reduced to about eighty percent (or so) of the network and/or set rates (or the lesser of the two) as desired. This transition may occur after a second (or so) has elapsed since the prior adjustment in some embodiments. Other embodiments may additionally (or alternatively) constrain the new encoder rate to prevent changes greater than an appropriate percentage (e.g., 50% or so) of the prior encoding rate when buffer 105 becomes nearly full. That is, if a significant swing in the network rate calls for a significant downgrade, it may be desirable to respond without creating suddenly large shifts in the encoding rate. Such adjustments may be performed even if little or no delay has occurred after the prior adjustment, although some delay (e.g., 500 ms or so) may be imposed when the encoder rate is already relatively low to prevent excessive degradation to very low encoder rates.
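The "red"-zone downgrade with its single-step clamp can be sketched as follows. The 80% target and 50% clamp mirror the exemplary values above; the function and parameter names are hypothetical.

```python
def red_zone_downgrade(current_rate, network_rate, set_rate,
                       target_fraction=0.80, floor_fraction=0.50):
    """Drop the encoder rate to ~80% of the lesser of the network and set
    rates when the buffer reaches the "red" zone, but never cut the rate by
    more than ~50% in a single adjustment.
    """
    target = target_fraction * min(network_rate, set_rate)
    # Clamp: one adjustment may not go below 50% of the prior encoding rate,
    # so a sudden network swing does not cause an abrupt quality collapse.
    return max(target, floor_fraction * current_rate)
```

Here the clamp binds only when the computed target would cut the rate by more than half; otherwise the 80% target is used directly.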
In some embodiments, it may be desirable to prevent degradation during periods of slight transition. If the actual encoder rate is significantly greater (e.g., ten percent or so greater) than the set rate, it may be desirable to simply maintain the current encoder rate for a period of time, provided that buffer 105 is not overly full and conditions otherwise warrant. If the system temporarily degrades from "green" to "yellow" performance, for example, yet the actual encoder rate still exceeds the set rate, it may be acceptable to retain the "green" encoder rate until conditions warrant an adjustment.
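This hold-steady rule can be sketched as a simple predicate. The 10% margin mirrors the example above; the 70% "overly full" threshold is our assumption for illustration, since the text does not specify one.

```python
def should_hold_rate(actual_rate, set_rate, buffer_fill_fraction,
                     margin=0.10, full_threshold=0.70):
    """Return True when a slight downward zone transition should be ignored:
    the encoder is actually delivering well above its set rate and the
    buffer is not overly full (threshold assumed, not from the text).
    """
    return (actual_rate >= set_rate * (1.0 + margin)
            and buffer_fill_fraction < full_threshold)
```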
Less dramatic adjustments in the encoder rate may also be performed as desired. If the network rate is less than the set encoder rate yet the buffer utilization is steadily rising, for example, it may be desirable to decrease the set encoder rate by ten percent (or so) to respond to the downward trend. Such trends may be caused by temporary network congestion (or other network effects), however, so it may not be desirable to use the network rate in setting the newly-selected encoder rate during such times. Further, it may be desirable in some instances to prevent inordinately large shifts in the encoder rate (e.g., shifts greater than 30% or so) for such transitions. Adjustments of this sort might not be performed until an appropriate delay (e.g., a second or so) has occurred following a prior adjustment.
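The gentler trend response above can likewise be sketched. The 10% cut, 30% maximum shift, and ~1 s delay mirror the exemplary values; the function name and argument names are ours. Note that the new rate is derived from the prior set rate rather than the (possibly transiently depressed) network rate, per the caution above.

```python
def trend_decrease(current_set_rate, seconds_since_last_change, buffer_rising,
                   decrease_fraction=0.10, max_shift_fraction=0.30,
                   min_delay_s=1.0):
    """Respond to a steadily rising buffer with a modest rate cut, ignoring
    the network rate (which may reflect temporary congestion), capped at a
    ~30% single-step shift and rate-limited to ~1 s intervals.
    """
    if not buffer_rising or seconds_since_last_change < min_delay_s:
        return current_set_rate  # no downward trend, or too soon to act
    new_rate = current_set_rate * (1.0 - decrease_fraction)
    # The shift cap only binds if decrease_fraction is configured larger
    # than max_shift_fraction.
    return max(new_rate, current_set_rate * (1.0 - max_shift_fraction))
```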
In the preceding example, the encoding parameter (e.g., the bit rate of an encoded media stream or the like) can be adjusted based upon specific values that are observed and/or computed based upon the rate at which the buffer 105 fills and empties, thereby providing significantly greater flexibility and performance than prior techniques that relied solely upon watermark transitions or the like. This results in a significantly enhanced user experience.
As noted above, the particular adjustments, transitions, transition parameters, timing parameters and other specific features of the preceding example are intended solely for purposes of illustration, and are not meant to be limiting. Other embodiments may provide parameter rate adjustment techniques and systems that vary significantly from those described herein, and that use any number of alternate or additional parameters and parameter values.
The term “exemplary” is used herein to represent one example, instance or illustration that may have any number of alternates. Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations. While several exemplary embodiments have been presented in the foregoing detailed description, it should be appreciated that a vast number of alternate but equivalent variations exist, and the examples presented herein are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of elements described without departing from the scope of the claims and their legal equivalents.
This application is a continuation-in-part of U.S. patent application Ser. No. 11/147,663, entitled "Personal media broadcasting system with output buffer," filed on Jun. 7, 2005, which claims the benefit of U.S. Provisional Application No. 60/577,833, filed Jun. 7, 2004. Both of these applications are incorporated herein by reference in their entirety.
Prior Publication: US 2009/0103607 A1, published Apr. 2009 (US).

Related U.S. Application Data: parent application Ser. No. 11/147,663, filed Jun. 2005 (child application Ser. No. 12/339,878); provisional Application No. 60/577,833, filed Jun. 2004.