The present invention relates generally to wireless communications systems, and more specifically to data compression methods for wireless video data.
As the bandwidth of wireless systems expands, transmitting digital video over wireless links becomes easier and more cost effective. However, the data capacity of such wireless systems may vary over time depending on the surrounding conditions. This creates a significant challenge for developing a low-cost design that provides sufficient quality of service for consumer video applications.
Typical video compression techniques require buffering of multiple frames of data for the compression and decompression steps. Unfortunately, this buffering requires significant memory, which can add considerable cost to the system.
Furthermore, video compression techniques themselves tend to be variable rate, as the entropy of the source data tends to change on a line-by-line or frame-by-frame basis. This variable entropy also necessitates a dynamic adaptation algorithm.
Current video transmission systems use standard compression techniques such as MPEG2 (Moving Picture Experts Group) and JPEG2000 (Joint Photographic Experts Group). MPEG2 uses compression techniques that take advantage of similarities between pixels as well as similarities between multiple frames of data. Because of this approach, any change in the compression algorithm can only take effect after several frames have been transmitted. This delay requires significant amounts of data to be buffered in the transmission system.
JPEG2000 uses compression techniques that are executed over an entire frame of data. This too requires at least one complete frame of data to be transmitted before the compression algorithm can be changed.
Existing compression systems also use a fixed quantization and compression mechanism to enable either a constant bit rate data stream or a “variable” bit rate data stream. The definition of “variable” in this case is dictated by the algorithm, and there is no feedback mechanism from the channel to change the bit rate based upon channel conditions.
To enable low cost wireless transport of digital video, it would be desirable to have a method to dynamically adapt the data rate required for the video while buffering a small percentage of a single frame of video. This dynamic adjustment of the data rate should take place without any interruption of service to the viewer.
The present invention provides a method and apparatus for compressing video data for wireless transmission. The invention continually monitors the output of a video data encoder and maintains a running average for the data output rate over multiple lines of video data. The encoder compresses video data on a line-by-line basis. The invention also monitors the occupancy of the data buffer as well as the performance of the wireless subsystem to determine real-time available channel capacity. If the average data rate exceeds the available channel capacity, the invention increases data compression on subsequent lines of video data until the average data rate falls within the channel capacity. If the average data output rate is less than the channel capacity, the invention reduces data compression on subsequent lines of video data until the average data rate increases to match the channel capacity.
The novel features believed characteristic of the invention are set forth in the appended claims. The invention itself, however, as well as a preferred mode of use, further objects and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
The present invention provides a method for compressing video for transmission over a wireless link. An encoder is responsible for taking uncompressed video data and encoding it into compressed packets for transmission over the link. A decoder is responsible for extracting the information from the received packets, decompressing it, and presenting it as uncompressed video.
The encoder uses several techniques for compressing the video data. These include:
Color space conversion (RGB to YCbCr) with sub-sampling of CbCr
Wavelet transform to enable efficient entropy encoding
Quantization to vary the level of compression
Entropy encoding
The Select YUV control 303 determines whether the YUV Format Conversion block 304 is bypassed or whether lowpass filters and downsamplers will convert YUV 4:4:4 format to YUV 4:2:2 or YUV 4:1:1 format.
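For illustration, one possible form of the 4:2:2 down-sampling path is sketched below in C. The two-tap averaging filter, function name, and data types are assumptions made for the sketch; the description above does not specify the filter coefficients.

    /* Illustrative sketch only: convert one line of YUV 4:4:4 chroma to
     * YUV 4:2:2 by lowpass filtering and decimating the U and V
     * components by two.  The two-tap averaging filter is an assumed
     * example; the actual filter coefficients are not specified above. */
    void yuv444_to_yuv422_line(const unsigned char *u444,
                               const unsigned char *v444,
                               unsigned char *u422,
                               unsigned char *v422,
                               int width)
    {
        for (int i = 0; i + 1 < width; i += 2) {
            u422[i / 2] = (unsigned char)((u444[i] + u444[i + 1] + 1) >> 1);
            v422[i / 2] = (unsigned char)((v444[i] + v444[i + 1] + 1) >> 1);
        }
    }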
The YUV components produced by the YUV Format Conversion Block 304 are passed through separate wavelet transforms 306(Y), 307(U) and 308(V). The SelectTfm control 305 determines whether two-level integer versions of the Daubechies (7,9) or the LeGall (5,3) wavelet transforms are used to generate lowpass and highpass subbands for each color-component. Because two-level wavelet transforms are used, for a given line, the number of lowpass (high priority) wavelet coefficients constitutes 25% of the pixels in the line, and highpass (low priority) wavelet coefficients constitute 75% of the pixels in the line.
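As a sketch of one of the named transforms, the following C routine performs a single level of the integer LeGall (5,3) lifting transform on one line. The symmetric boundary handling and function name are assumptions; the encoder described above applies two such levels per color component.

    /* Illustrative one-level LeGall (5,3) integer lifting transform on a
     * single line.  lp[] receives the lowpass subband and hp[] the
     * highpass subband.  Symmetric boundary extension is assumed; a
     * second level applied to lp[] would yield the two-level
     * decomposition described above (25% lowpass, 75% highpass). */
    void legall53_forward_line(const int *x, int *lp, int *hp, int n)
    {
        int half = n / 2;                       /* assume n is even for brevity */

        /* Predict step: highpass coefficients */
        for (int i = 0; i < half; i++) {
            int left  = x[2 * i];
            int right = (2 * i + 2 < n) ? x[2 * i + 2] : x[2 * i];
            hp[i] = x[2 * i + 1] - ((left + right) >> 1);
        }

        /* Update step: lowpass coefficients */
        for (int i = 0; i < half; i++) {
            int left  = (i > 0) ? hp[i - 1] : hp[0];
            int right = hp[i];
            lp[i] = x[2 * i] + ((left + right + 2) >> 2);
        }
    }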
For the Y-component, the lowpass subband is labeled LpY, and the highpass subband is labeled HpY. Similarly, the corresponding subbands for the U and V components are labeled LpU, HpU, LpV, HpV respectively. The LpY, HpY, LpU, HpU, LpV, HpV subbands for a given line are stored in a RAM 309 because video rate control requires access to all subbands for a given line to determine optimal quantization settings for those subbands.
Once the wavelet transforms are completed, video rate control determines the optimal quantization levels qLY, qHY, qLU, qHU, qLV, qHV for the respective subbands LpY, HpY, LpU, HpU, LpV, HpV. The respective pairs of subband coefficients and quantization levels (e.g., LpY/qLY, HpY/qHY, etc.) are fed into quantizers 310-315, which calculate the quantized coefficient xq by the following process:
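The pseudocode itself is not reproduced in this text; the following C sketch restates the operation described in the next paragraph, with x denoting a wavelet coefficient and q the quantization level for its subband.

    /* Quantizer sketch: drop the q least significant bits of x by
     * right-shifting to obtain xq, then increment negative results so
     * that negative numbers are rounded towards zero. */
    int quantize(int x, int q)
    {
        int xq = x >> q;        /* eliminate the q least significant bits */
        if (xq < 0)
            xq = xq + 1;        /* round negative values towards zero */
        return xq;
    }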
The above pseudocode shows that the quantizer eliminates the q least significant bits of x by right-shifting to get xq. To obtain shorter entropy codes, negative values of xq are incremented so that negative numbers are rounded towards zero.
After quantization, the data is fed into entropy encoders 316-321, and the invention labels the entropy-coded LpY′, LpU′, LpV′, HpY′, HpU′, and HpV′ data as HiPriY, HiPriU, HiPriV, LoPriY, LoPriU, LoPriV, respectively.
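The description does not specify which entropy code the encoders 316-321 use. Purely as an illustrative assumption, the sketch below maps each signed quantized coefficient to an unsigned value and computes the length of an order-0 Exp-Golomb codeword, so that coefficients near zero receive the shortest codes.

    /* Illustrative assumption only: a simple Exp-Golomb style mapping.
     * Signed coefficients 0, -1, 1, -2, 2, ... map to 0, 1, 2, 3, 4, ...
     * and the codeword length grows with magnitude, so heavily
     * quantized (small) coefficients produce short codes. */
    unsigned map_signed_to_unsigned(int xq)
    {
        return (xq >= 0) ? (unsigned)(2 * xq) : (unsigned)(-2 * xq - 1);
    }

    int exp_golomb_code_length(unsigned v)
    {
        int bits = 0;
        unsigned n = v + 1;
        while (n) {             /* count the bits needed to represent v + 1 */
            bits++;
            n >>= 1;
        }
        return 2 * bits - 1;    /* prefix zeros plus information bits */
    }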
The present invention can be implemented using either lossless or lossy compression. The example system described above produces lossy compression. To achieve lossless compression, the quantizers 310-315 would be absent, and the data would move directly from the wavelet transforms 306-308 to the entropy encoders 316-321.
Compression is performed over a single line or a few (e.g., <8) lines of video. The compression ratio is set by allowing different settings for the sub-sampling and quantization. These settings can be changed as needed to ensure the video data rate does not exceed the capacity of the wireless link. The settings can also be changed several times during a single frame of video. A Control Entity is used to adjust the video compression parameters.
By monitoring the activity of the above components, the Control Entity can adjust multiple parameters to set the compression ratio for each line of video. The Control Entity contains a monitor 504 that collects information from three sources to make a decision regarding the compression settings. The information includes:
The rate of data output by the encoder, averaged over multiple lines of video
The current occupancy of the data buffer
The performance of the wireless subsystem, which indicates the real-time available channel capacity
The parameters that can be adjusted in the compression algorithm are color space sub-sampling and quantization. The quantization level can be set independently for each color component as well as separately for highpass and lowpass information out of the wavelet transform.
A table is maintained at both the encoder and decoder that contains multiple sets of the adjustable encoder parameters. Each set corresponds to a different compression ratio and video quality. The index to this table is used to adjust the compression ratio and video quality. The index used to encode any given packet is placed in the header of the packet. The decoder then uses this index to enable the correct decoding of the packet. The encoder can simply increment or decrement the index to adjust the compression ratio.
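A minimal sketch of such a table is shown below in C; the field names, entry values, and number of entries are hypothetical and chosen only to illustrate the increment/decrement mechanism.

    /* Hypothetical parameter table shared by the encoder and decoder.
     * Each entry is one complete set of adjustable settings; the values
     * shown are illustrative, not taken from the description above.
     * The index used to encode a packet travels in the packet header. */
    typedef struct {
        int yuv_format;   /* 0 = 4:4:4, 1 = 4:2:2, 2 = 4:1:1            */
        int qLY, qHY;     /* quantization levels for Y lowpass/highpass */
        int qLU, qHU;     /* quantization levels for U lowpass/highpass */
        int qLV, qHV;     /* quantization levels for V lowpass/highpass */
    } CompressionSettings;

    static const CompressionSettings settings_table[] = {
        { 0, 0, 0, 0, 0, 0, 0 },   /* index 0: least compression, best quality */
        { 1, 0, 1, 0, 1, 0, 1 },
        { 1, 1, 2, 1, 2, 1, 2 },
        { 2, 2, 4, 2, 4, 2, 4 },   /* higher index: more compression           */
    };

    /* The encoder raises or lowers compression by stepping the index. */
    int adjust_index(int index, int delta)
    {
        int n = (int)(sizeof(settings_table) / sizeof(settings_table[0]));
        index += delta;
        if (index < 0)      index = 0;
        if (index > n - 1)  index = n - 1;
        return index;
    }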
The Control Entity continually monitors the rate of data coming out of the encoder and calculates a running average of the encoder output data rate (step 601). The Control Entity also monitors the current buffer occupancy (step 602) and the wireless subsystem (step 603) to determine the current buffer and channel capacity.
As the channel capacity of the wireless link fluctuates due to interference, rather than re-encoding data to meet the exact data rate requirement, the Control Entity maintains the average data rate at the output of the encoder, while simultaneously minimizing the variance. This is achieved by determining if the average data rate is above or below the current channel capacity (step 604). If the average data rate is higher than the channel capacity, subsequent lines are more heavily compressed to bring the average down (step 605). If the encoder output rate is below the channel capacity, the compression level may be reduced to improve the quality of the image (step 606). This process allows the compression ratio to be adjusted on the fly on a line-by-line basis.
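The decision in steps 601 through 606 can be sketched as follows, reusing the hypothetical adjust_index helper above; the exponential moving average and its smoothing factor are assumptions, since the description only requires a running average over multiple lines.

    /* Per-line rate-control sketch (steps 601-606).  Raises the table
     * index (more compression) when the average encoder output rate
     * exceeds the available channel capacity, and lowers it (better
     * quality) when the average falls below capacity. */
    int update_rate_control(double *avg_rate_bps,          /* running average, updated in place */
                            double line_rate_bps,          /* encoder output for this line      */
                            double channel_capacity_bps,   /* or a lower target rate for margin */
                            int index)
    {
        const double alpha = 0.05;   /* assumed smoothing factor */

        *avg_rate_bps = (1.0 - alpha) * (*avg_rate_bps) + alpha * line_rate_bps;

        if (*avg_rate_bps > channel_capacity_bps)
            index = adjust_index(index, +1);    /* step 605: compress subsequent lines more      */
        else if (*avg_rate_bps < channel_capacity_bps)
            index = adjust_index(index, -1);    /* step 606: reduce compression, improve quality */

        return index;
    }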
The channel capacity may also be replaced with some other data rate target that may be below the true channel capacity to enable a margin and thereby minimize errors.
The present invention can significantly lower the amount of buffering required in the wireless link to transmit high quality video. This is accomplished in two ways. First, since the compression algorithm is run over a small number of lines, the amount of buffering required is limited to these lines. Second, the quality and capacity of a wireless link can change over time. To deal with these changes, prior art systems use additional buffering to accommodate short “outages” on the link. The present invention simply increases the compression ratio when needed to keep an outage from affecting the overall video picture.
The present invention also enables a wireless link to provide video displays at lower quality when the link becomes poor. Rather than having a complete outage of video, the wireless link can change the encoder settings to enable a much lower data rate while still maintaining a visible picture. This can allow the system to provide information to the user to “fix” the wireless link if needed.
The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiment was chosen and described in order to best explain the principles of the invention, the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated. It will be understood by one of ordinary skill in the art that numerous variations will be possible to the disclosed embodiments without going outside the scope of the invention as disclosed in the claims.
This application claims the benefit of and priority to U.S. Provisional Patent Application No. 60/642,737, filed Jan. 10, 2005, the technical disclosures of which are hereby incorporated herein by reference.