Claims
- 1. A method of encoding a request sequence from an automatic repeat request system, the method comprising the steps of:
receiving said request sequence from said automatic repeat request system, said request sequence including a sequence of binary symbols;
counting a number of 0s and 1s in said sequence of binary symbols in accordance with a Markov model;
estimating a probability associated with each binary symbol based on the number of 0s and 1s counted in each state of said Markov model; and
encoding said binary symbols on a symbol-by-symbol basis using said probability estimates.
- 2. The method of encoding according to claim 1, wherein said Markov model is a Markov chain of order two with states “11” and “01” joined together into one state “1”.
- 3. The method of encoding according to claim 1, wherein said Markov model is a Context Tree of depth two with states “11” and “01” joined together into one state “1”.
- 4. The method of encoding according to claim 1, wherein said encoding step includes entropy encoding.
- 5. The method of encoding according to claim 4, wherein said entropy encoding includes arithmetic symbol encoding.
- 6. The method of encoding according to claim 1, further comprising including a length of said sequence of binary symbols along with said encoded binary symbols.
- 7. The method of encoding according to claim 1, wherein said step of encoding includes a lossy encoding.
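Claims 1 through 3 describe an adaptive model that counts symbols per Markov state and derives probability estimates from those counts. A minimal sketch of that idea follows, using a second-order model whose states "11" and "01" are merged into a single state "1" (leaving three states: "00", "10", and "1"). The add-1/2 (Krichevsky-Trofimov) estimator and the all-zero initial context are assumptions for illustration; the claims only require that the estimate be based on the per-state counts.

```python
def state_of(prev2, prev1):
    """Map the last two symbols to a model state; contexts ending in 1 merge."""
    return "1" if prev1 == 1 else f"{prev2}{prev1}"

class MergedStateModel:
    def __init__(self):
        # counts[state] = [number of 0s seen, number of 1s seen]
        self.counts = {"00": [0, 0], "10": [0, 0], "1": [0, 0]}

    def prob_of_one(self, state):
        # Add-1/2 estimate from the running counts (an assumption here).
        n0, n1 = self.counts[state]
        return (n1 + 0.5) / (n0 + n1 + 1.0)

    def update(self, state, symbol):
        self.counts[state][symbol] += 1

def estimate(sequence):
    """Return the per-symbol P(1) estimates a coder would consume."""
    model = MergedStateModel()
    prev2, prev1 = 0, 0          # assumed all-zero initial context
    probs = []
    for s in sequence:
        st = state_of(prev2, prev1)
        probs.append(model.prob_of_one(st))
        model.update(st, s)
        prev2, prev1 = prev1, s
    return probs
```

Each estimate is produced before the model sees the symbol, so a decoder replaying the same updates can reproduce the identical probabilities.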
- 8. A method of decoding an encoded request sequence from an automatic repeat request system, the method comprising the steps of:
receiving said encoded request sequence from said automatic repeat request system;
decoding said encoded request sequence into a sequence of binary symbols on a symbol-by-symbol basis;
counting a number of 0s and 1s in said sequence of binary symbols in accordance with a Markov model; and
estimating a probability associated with each binary symbol based on the number of 0s and 1s counted in each state of said Markov model.
- 9. The method of decoding according to claim 8, wherein said Markov model is a Markov chain of order two with states “11” and “01” joined together into one state “1”.
- 10. The method of decoding according to claim 8, wherein said Markov model is a Context Tree of depth two with states “11” and “01” joined together into one state “1”.
- 11. The method of decoding according to claim 8, wherein said decoding step includes entropy decoding.
- 12. The method of decoding according to claim 11, wherein said entropy decoding includes arithmetic symbol decoding.
- 13. The method of decoding according to claim 8, wherein a length of said sequence of binary symbols is included with said encoded request sequence.
- 14. The method of decoding according to claim 8, wherein said step of decoding includes a lossy decoding.
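Claims 8 and 12 together describe symbol-by-symbol arithmetic decoding in which the decoder rebuilds the same counts as the encoder, and claim 13 notes that the sequence length travels with the code. A toy floating-point arithmetic coder can illustrate that symmetry. This is a hypothetical sketch, not the patent's implementation: it uses a single-state (order-0) adaptive model for brevity rather than the claimed Markov model, and real coders use integer renormalization instead of raw doubles, which only suffice for short sequences.

```python
class Adaptive0:
    """Single-state adaptive model: P(1) from running 0/1 counts."""
    def __init__(self):
        self.n0 = self.n1 = 0

    def p1(self):
        return (self.n1 + 0.5) / (self.n0 + self.n1 + 1.0)

    def update(self, b):
        self.n1 += b
        self.n0 += 1 - b

def encode(bits):
    """Return (length, code value); the length is sent with the code."""
    model, low, high = Adaptive0(), 0.0, 1.0
    for b in bits:
        split = low + (high - low) * (1.0 - model.p1())
        if b == 0:
            high = split      # symbol 0 takes the lower sub-interval
        else:
            low = split       # symbol 1 takes the upper sub-interval
        model.update(b)
    return len(bits), (low + high) / 2.0   # any point in the final interval

def decode(n, value):
    """Rebuild n symbols; the model is updated exactly as in encode."""
    model, low, high = Adaptive0(), 0.0, 1.0
    out = []
    for _ in range(n):
        split = low + (high - low) * (1.0 - model.p1())
        if value < split:
            b, high = 0, split
        else:
            b, low = 1, split
        out.append(b)
        model.update(b)
    return out
```

Because the decoder performs the identical interval splits and count updates, it recovers the probabilities without any side information beyond the length and the code value.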
- 15. An encoder for encoding a request sequence from an automatic repeat request system, comprising:
a probability estimation module configured to receive said request sequence including a sequence of binary symbols from said automatic repeat request system, said probability estimation module further configured to count a number of 0s and 1s in said sequence of binary symbols in accordance with a Markov model, and to estimate a probability associated with each binary symbol based on the number of 0s and 1s counted in each state of said Markov model; and
an encoder module, said encoder module configured to encode said binary symbols on a symbol-by-symbol basis using said probability estimates.
- 16. The encoder according to claim 15, wherein said Markov model is a Markov chain of order two with states “11” and “01” joined together into one state “1”.
- 17. The encoder according to claim 15, wherein said Markov model is a Context Tree of depth two with states “11” and “01” joined together into one state “1”.
- 18. The encoder according to claim 15, wherein said encoder module includes an entropy encoder module.
- 19. The encoder according to claim 18, wherein said entropy encoder module includes an arithmetic symbol encoder module.
- 20. The encoder according to claim 15, wherein said encoder module is further configured to include a length of said sequence of binary symbols along with said encoded binary symbols.
- 21. The encoder according to claim 15, wherein said encoder module is a lossy encoder module.
- 22. A decoder for decoding an encoded request sequence from an automatic repeat request system, comprising:
a decoder module configured to receive said encoded request sequence and to decode said encoded request sequence into a sequence of binary symbols, wherein said decoder module decodes said encoded request sequence on a symbol-by-symbol basis; and
a probability estimation module configured to count a number of 0s and 1s in said sequence of binary symbols in accordance with a Markov model, and to estimate a probability associated with each binary symbol based on the number of 0s and 1s counted in each state of said Markov model.
- 23. The decoder according to claim 22, wherein said Markov model is a Markov chain of order two with states “11” and “01” joined together into one state “1”.
- 24. The decoder according to claim 22, wherein said Markov model is a Context Tree of depth two with states “11” and “01” joined together into one state “1”.
- 25. The decoder according to claim 22, wherein said decoder module includes an entropy decoder module.
- 26. The decoder according to claim 25, wherein said entropy decoder module includes an arithmetic symbol decoder module.
- 27. The decoder according to claim 22, wherein a length of said sequence of binary symbols is included with said encoded request sequence.
- 28. The decoder according to claim 22, wherein said decoder module includes a lossy decoder module.
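The apparatus claims pair the probability estimation module with an encoder or decoder module. A hypothetical end-to-end sketch of that pairing is given below, combining the three-state merged Markov model of claims 16 and 23 with a toy floating-point arithmetic coder. The class and method names are illustrative only, the add-1/2 estimator and all-zero initial context are assumptions, and, per claims 20 and 27, the sequence length is carried alongside the code value.

```python
class Model:
    """Second-order Markov model with states '11' and '01' merged into '1'."""
    def __init__(self):
        self.counts = {"00": [0, 0], "10": [0, 0], "1": [0, 0]}
        self.prev2 = self.prev1 = 0      # assumed all-zero initial context

    def state(self):
        return "1" if self.prev1 == 1 else f"{self.prev2}{self.prev1}"

    def p1(self):
        n0, n1 = self.counts[self.state()]
        return (n1 + 0.5) / (n0 + n1 + 1.0)

    def update(self, b):
        self.counts[self.state()][b] += 1
        self.prev2, self.prev1 = self.prev1, b

class EncoderModule:
    """Narrows an interval per symbol using the model's estimates."""
    def encode(self, bits):
        model, low, high = Model(), 0.0, 1.0
        for b in bits:
            split = low + (high - low) * (1.0 - model.p1())
            if b == 0:
                high = split
            else:
                low = split
            model.update(b)
        return len(bits), (low + high) / 2.0   # length sent with the code

class DecoderModule:
    """Mirrors the encoder's splits and count updates symbol by symbol."""
    def decode(self, n, value):
        model, low, high = Model(), 0.0, 1.0
        out = []
        for _ in range(n):
            split = low + (high - low) * (1.0 - model.p1())
            if value < split:
                b, high = 0, split
            else:
                b, low = 1, split
            out.append(b)
            model.update(b)
        return out
```

The decoder module carries its own copy of the probability estimation module and feeds it each decoded symbol, so both sides derive identical estimates from identical counts.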
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is related to, claims priority from, and hereby incorporates by reference, U.S. Provisional Application No. 60/248,163, entitled “The Qualitative Modeling and Compression of the Request Sequences in ARQ Protocols,” filed with the U.S. Patent and Trademark Office on Nov. 13, 2000.