In digital communication, transmission of information sometimes introduces errors. Techniques have been developed to monitor, reduce, and in some cases correct such errors. One area of development has centered on Hamming codes, and particularly shortened Hamming codes.
Hamming codes introduce redundancy by adding information to existing data so that errors can be identified and corrected after transmission. For example, appending an error correction code to a unit of data and transmitting the resulting codeword can provide higher tolerance to noise and errors.
Typically, a transmitter encodes a data unit to produce what is sometimes referred to as a “codeword,” and sends the codeword to a receiver. The receiver decodes the codeword to obtain the original data unit and the error correction code. A decoder in the receiver may include a trellis representation of a Hamming code; a trellis representation is a view of a convolutional or block code explained using a trellis diagram.
In drawing a trellis, sets of states are used to represent all possible points which can be assumed at successive stages by a state machine, which is used to encode source data. Before sending, data is encoded into a codeword from a limited number of possible codewords including error correction data. Only a specific set of codewords is permitted for transmission. Upon receipt, a receiver implementing a trellis decoder decodes the codewords and provides the data to a communications system.
Once a codeword has been properly received, a trellis search algorithm, such as the Viterbi algorithm or the Bahl, Cocke, Jelinek, Raviv (BCJR) algorithm, can be used to decode the codeword. Notably, a large number of computational steps is required to perform Viterbi or other trellis search decoding, and the complexity of the decoder increases with the size of the trellis structure on which the decoder operates.
The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools, and methods that are meant to be exemplary and illustrative, not limiting in scope. In various embodiments, one or more of the above-described problems have been reduced or eliminated, while other embodiments are directed to other improvements. Advantageously, this technique can decrease complexity while reducing power consumption. It may even facilitate a reduction in die size.
The proposed decoding technique applies to any system that uses a shortened Hamming (15, 10) block code. This code is used, for example, in Bluetooth radios, where reduced decoding complexity is extremely important due to the need for low power.
Embodiments of the inventions are illustrated in the figures. However, the embodiments and figures are illustrative rather than limiting; they provide examples of the inventions.
In the following description, several specific details are presented to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or in combination with other components, etc. In other instances, well-known implementations or operations are not shown or described in detail to avoid obscuring aspects of various embodiments of the invention.
In the example of
It may be noted that the initial node 112 need not be treated as part of the initial stages 104 nodes (at least in part because it does not have one input branch like the one-hop to five-hop node sets). For similar reasons (i.e., the 10-hop nodes have different characteristics from the six- to nine-hop nodes), the 10-hop nodes may be treated as part of the final stages 108 nodes, rather than the intermediate stages 106 nodes, though at times it will be convenient to refer to the 10-hop nodes as part of the intermediate stages 106 nodes. For similar reasons (i.e., the final node 118 has no output branches), the final node 118 need not be treated as part of the final stages 108.
In the example of
In the example of
In the example of
As should be apparent from the example of
As a rule, in the traditional trellis representation 100, a set of i-hop nodes is an initial stage set of nodes if the set of (i+1)-hop nodes is larger; an intermediate stage set of nodes if the set of (i+1)-hop nodes is equal in size; and a final stage set of nodes if the set of (i+1)-hop nodes is smaller. In general, such a rule cannot be rigidly applied to a minimal trellis representation, such as depicted later by way of example but not limitation in
An (N, K) block code with codewords of length N for K information bits can be represented as a punctured convolutional code and decoded using a time-invariant trellis with N sections and 2^(N−K) states. The shortened Hamming (15, 10) code used in Bluetooth is a systematic code, so the first K=10 bits of a codeword are the information bits, and the remaining N−K=5 bits of a codeword are parity bits generated by modulo-2 addition of different combinations of the information bits. The codebook C has 2^10 = 1024 codewords. The generator polynomial for the shortened Hamming code (15, 10) is
g(D) = (D + 1)(D^4 + D + 1) = D^5 + D^4 + D^2 + 1.
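The product can be verified with a short GF(2) polynomial multiplication; the following is a minimal sketch (the function name and coefficient layout are illustrative, not part of the described embodiment):

```python
# Multiply two polynomials over GF(2); coefficients are bit lists,
# with index i holding the coefficient of D^i (low order first).
def gf2_poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj  # addition over GF(2) is XOR
    return out

# (D + 1) times (D^4 + D + 1)
g = gf2_poly_mul([1, 1], [1, 1, 0, 0, 1])
# g == [1, 0, 1, 0, 1, 1], i.e. 1 + D^2 + D^4 + D^5
```

Note that the cross terms D + D cancel modulo 2, which is why only four terms survive in the product.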
The generator matrix can be found from the generator polynomial
This code has a minimum Hamming distance of 4.
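Because the code is systematic, the parity bits can be obtained as the remainder of m(D)·D^5 divided by g(D), and the minimum distance can be confirmed by enumerating all 1024 codewords. The sketch below assumes this standard polynomial-division encoder; the constant and function names are illustrative:

```python
G_POLY = 0b110101  # g(D) = D^5 + D^4 + D^2 + 1, high bit = coefficient of D^5

def encode_15_10(msg):
    """Encode a 10-bit message into a 15-bit systematic codeword:
    10 information bits followed by 5 parity bits."""
    rem = msg << 5                    # m(D) * D^5
    for shift in range(9, -1, -1):    # long division by g(D) over GF(2)
        if rem & (1 << (shift + 5)):
            rem ^= G_POLY << shift
    return (msg << 5) | rem           # parity bits are the remainder

# For a linear code, the minimum distance equals the minimum
# nonzero codeword weight.
d_min = min(bin(encode_15_10(m)).count("1") for m in range(1, 1 << 10))
# d_min == 4
```

The factor (D + 1) in g(D) forces every codeword to have even weight, which is what raises the minimum distance from 3 (plain Hamming) to 4.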
The trellis for the shortened Hamming (15, 10) code viewed as a convolutional code has 32 states per section (disregarding the initial and the termination sections) and its typical representation is shown in
Viterbi decoding can be implemented based on a trellis representation for a code by eliminating codewords associated with all paths into each trellis state except the path with the maximum likelihood metric. The complexity of the decoder can be measured in the number of solid line branches (which correspond to additions in computing the metric) and the number of states that have two entering branches (which correspond to comparisons to decide which has the larger metric). The trellis in
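The two operation counts correspond to the add-compare-select (ACS) step performed at each trellis state during Viterbi decoding. The sketch below is illustrative (the function and its signature are not taken from the described embodiment):

```python
# Add-compare-select at a state with two entering branches: two additions
# extend the candidate path metrics, one comparison picks the survivor.
def acs(metric0, branch0, metric1, branch1):
    cand0 = metric0 + branch0   # one addition per entering branch
    cand1 = metric1 + branch1
    if cand0 >= cand1:          # one comparison per two-branch state
        return cand0, 0         # surviving metric, surviving branch index
    return cand1, 1
```

A state with only one entering branch needs only the addition, which is why the complexity measure counts branches and two-branch states separately.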
The traditional trellis representation shown in
In an illustrative embodiment, the number of initial stages is reduced by one. This is associated with a corresponding decrease in the size of the largest node sets to 16, rather than 32 as would be the case for a traditional trellis. The number of intermediate stages (assuming median stages are included) is increased by one.
The complexity of the minimal trellis representation 200 is associated with the number of branches and states in the trellis. Although the complexity is reduced in the minimal trellis relative to a traditional trellis (thereby tending to reduce the complexity of a corresponding decoder), the behavior at various nodes is more variable. For example, the intermediate nodes include median nodes, as previously mentioned, and the final nodes include staggered final stage nodes, which have a different number of branches into and/or out of the node than, respectively, the other intermediate or final nodes.
As in
For the shortened Hamming (15, 10) code, we have found the permutation π that maps the codebook C to C* or, equivalently, maps the order of codeword bits in C to the order of the codeword bits in C*. This permutation of the 15 codeword bits r = [r1, r2, r3, r4, r5, r6, r7, r8, r9, r10, r11, r12, r13, r14, r15] corresponding to the minimal trellis is given by rπ = π(r) = [r1, r7, r9, r3, r10, r6, r11, r4, r5, r8, r14, r15, r12, r13, r2].
The permutation is used to significantly decrease the complexity of the optimal decoder. Specifically, soft or hard decoding of the shortened Hamming (15,10) code can be performed by first permuting the received matched filter outputs s≈r according to the permutation π, where s is the received codeword corresponding to the transmitted codeword r, and can be based on either hard or soft decisions of the matched filter output. Next the resulting vector sπ=π(s) is fed to the Viterbi (or other trellis search) decoder (operating on soft or hard inputs) corresponding to the minimal trellis in
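As a concrete sketch, the permutation can be applied with a fixed index table; the 0-based indices below are derived from the 1-based bit positions given above (the names are illustrative):

```python
# PI maps output position -> input position (0-based), from
# [r1, r7, r9, r3, r10, r6, r11, r4, r5, r8, r14, r15, r12, r13, r2].
PI = [0, 6, 8, 2, 9, 5, 10, 3, 4, 7, 13, 14, 11, 12, 1]

def permute(s):
    """Reorder a length-15 vector of matched filter outputs for the
    minimal-trellis decoder; works for hard bits or soft values alike."""
    return [s[i] for i in PI]

permute(list(range(1, 16)))
# -> [1, 7, 9, 3, 10, 6, 11, 4, 5, 8, 14, 15, 12, 13, 2]
```

Because permuting is a single table lookup per symbol, its cost is negligible next to the savings in the trellis search itself.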
The number of additions and comparisons using the minimal trellis operating on the permuted vector sπ is reported in the last two rows of Table 2. The decoder based on the full trellis and the decoder based on the minimal trellis may or may not have identical performance. Comparing Table 1 and Table 2 shows that the decoder based on the minimal trellis has 47% fewer additions and 50% fewer comparisons.
In the example of
In the example of
In a non-limiting embodiment, the mapping function 406 may be implemented in a computer readable medium such that a mapping engine could reorganize bits of a codeword for decoding by a minimal trellis, as depicted in the diagram 400. The mapping function corresponds to the minimal trellis and reorganizes received codewords that could otherwise be decoded by a traditional trellis.
While this invention has been described in terms of certain embodiments, it will be appreciated by those skilled in the art that certain modifications, permutations and equivalents thereof are within the inventive scope of the present invention. It is therefore intended that the following appended claims include all such modifications, permutations and equivalents as fall within the true spirit and scope of the present invention; the invention is limited only by the claims.
This application claims priority to U.S. Provisional Application 60/797,956, entitled Multimedia Cell Platform, filed May 4, 2006, which is incorporated by reference.