Probabilistic transition rule for two-level decoding of Reed-Solomon codes

Information

  • Patent Grant
  • 8171368
  • Patent Number
    8,171,368
  • Date Filed
    Monday, February 11, 2008
  • Date Issued
    Tuesday, May 1, 2012
Abstract
Decoding data is disclosed, including computing a syndrome polynomial for Reed-Solomon encoded data, applying the Berlekamp-Massey method to solve key equations associated with the syndrome polynomial, comparing a linear feedback shift register (LFSR) length LΛ associated with an error locator polynomial Λ(x) with an error correction capability t, and based on the comparison, determining whether to perform soft decision decoding or a Chien search.
Description
BACKGROUND OF THE INVENTION

Reed-Solomon codes are commonly used error correction codes. Their widespread applications include magnetic and optical data storage, wireline and wireless communications, and satellite communications. A Reed-Solomon (n,k) code over a finite field GF(q) satisfies n<q and achieves the maximum possible minimum distance, d=n−k+1; such codes are maximum distance separable. The Berlekamp-Massey method efficiently decodes up to half the minimum distance with complexity O(dn). The method can be divided into three operation stages. The first stage performs syndrome computation, which takes n cycles. The second stage computes the error locator polynomial and the scratch polynomial, which takes d cycles. (In practice, the code rate is high and thus the minimum distance d is much smaller than the code length n.) The third stage performs the Chien search and error evaluation, which takes n cycles. The total number of cycles is 2n+d. It would be desirable to reduce the latency of the whole decoding process while keeping both power consumption and decoding failure rate approximately the same. Such improvements would be particularly attractive for on-the-fly applications where low latency is desirable.





BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.



FIG. 1 is a flow chart illustrating an example of a process for performing conventional two-level decoding.



FIG. 2 is a flow chart illustrating an embodiment of a process for performing two-level decoding using a probabilistic transition rule.



FIG. 3 is a block diagram illustrating an embodiment of a system for performing two-level decoding utilizing the proposed probabilistic transition rule.





DETAILED DESCRIPTION

The invention can be implemented in numerous ways, including as a process, an apparatus, a system, a composition of matter, a computer readable medium such as a computer readable storage medium or a computer network wherein program instructions are sent over optical or communication links. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. A component such as a processor or a memory described as being configured to perform a task includes both a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. In general, the order of the steps of disclosed processes may be altered within the scope of the invention.


A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.



FIG. 1 is a flow chart illustrating an example of a process for performing conventional two-level decoding. In conventional two-level decoding, soft-decision decoding is performed only if hard-decision decoding fails.


Power and latency are two of the most important factors in on-the-fly applications, such as next generation data storage. In order to save power, conventionally, soft-decision decoding is performed only after hard-decision decoding is known to have failed, which typically means that if the Chien search fails, soft-decision decoding is performed, as shown in the example flowchart of FIG. 1. At 102, syndromes are computed.


Given Reed-Solomon C(n,k) code over GF(q), a codeword polynomial C(x) satisfies

C(α^(m0+i)) = 0, i = 0, 1, 2, . . . , n−k−1.


The minimum Hamming distance of the code is d=n−k+1, a property known as maximum distance separable. The error correction capability is defined as






t ≜ ⌊(d−1)/2⌋ = ⌊(n−k)/2⌋.






The hard-decision Berlekamp-Massey method is introduced next, which is the foundation of the two-level decoding regime.


Let C(x) denote the transmitted codeword polynomial and R(x) the received word polynomial after appropriate channel quantization. The decoding objective is to determine the error polynomial E(x) such that C(x)=R(x)−E(x).


The first step is computing syndrome values

S_i = R(α^(i+1)) = C(α^(i+1)) + E(α^(i+1)) = E(α^(i+1)), i = 0, 1, 2, . . . , n−k−1.


Define the syndrome polynomial

S(x) = S_0 + S_1x + S_2x^2 + . . . + S_{n−k−1}x^{n−k−1}.  (1)
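

As a concrete illustration of the syndrome computation in 102, the following is a minimal sketch that evaluates the received word R(x) at α^(i+1) for i = 0, 1, . . . , n−k−1 using Horner's rule. For readability it assumes a prime field GF(p) with primitive element α; a practical Reed-Solomon decoder operates over GF(2^m), typically with log/antilog tables, but the structure is the same. The function name and coefficient-list representation are illustrative assumptions, not taken from the patent.

```python
def compute_syndromes(received, n_minus_k, alpha, p):
    """Sketch of step 102: received is [R_0, ..., R_{n-1}] mod p; returns [S_0, ..., S_{n-k-1}]."""
    syndromes = []
    for i in range(n_minus_k):
        x = pow(alpha, i + 1, p)              # evaluation point alpha^(i+1)
        val = 0
        for coeff in reversed(received):      # Horner's rule evaluation of R(x)
            val = (val * x + coeff) % p
        syndromes.append(val)
    return syndromes
```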


Define the error locator polynomial










Λ(x) ≜ ∏_{i=1}^{e} (1 − X_i x) = 1 + Λ_1x + Λ_2x^2 + . . . + Λ_ex^e.  (2)







Define the error evaluator polynomial










Ω(x) ≜ Σ_{i=1}^{e} Y_iX_i ∏_{j=1, j≠i}^{e} (1 − X_j x) = Ω_0 + Ω_1x + Ω_2x^2 + . . . + Ω_{e−1}x^{e−1}.  (3)







The three polynomials satisfy the following key equation

Ω(x) = Λ(x)S(x) (mod x^{n−k}).  (4)
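

As a quick sanity check of the key equation, the sketch below multiplies Λ(x) by S(x), truncates the product at x^(n−k), and compares the result with Ω(x), using the same illustrative prime-field, list-of-coefficients convention as the syndrome sketch above.

```python
def check_key_equation(Lam, S, Omega, n_minus_k, p):
    """Verify equation (4): Omega(x) = Lambda(x)*S(x) mod x^(n-k), coefficients mod p.

    Lam, S, Omega are coefficient lists (lowest degree first); returns True/False.
    """
    prod = [0] * n_minus_k
    for i, a in enumerate(Lam):
        for j, b in enumerate(S):
            if i + j < n_minus_k:                 # discard terms of degree >= n-k
                prod[i + j] = (prod[i + j] + a * b) % p
    omega = [c % p for c in Omega] + [0] * max(0, n_minus_k - len(Omega))
    return prod == omega[:n_minus_k]
```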


Returning to FIG. 1, at 104, key equations are solved. In some embodiments, the key equations are solved using the Berlekamp-Massey method.


Pseudo code of the Berlekamp-Massey Method is given below:


Berlekamp-Massey Method

    • Input: S = [S_0, S_1, S_2, . . . , S_{n−k−1}]
    • Initialization: Λ^(0)(x) = 1, B^(0)(x) = 1, and L = 0
    • For r = 0, 1, 2, . . . , n−k−1, do:
      • Compute Δ_{r+1} = Σ_{i=0}^{L} Λ_i^(r)·S_{r−i}
      • Compute Λ^(r+1)(x) = Λ^(r)(x) − Δ_{r+1}·x·B^(r)(x)
      • If Δ_{r+1} ≠ 0 and 2L ≤ r, then
        • Set B^(r+1)(x) ← Δ_{r+1}^(−1)·Λ^(r)(x) and L ← r+1−L
      • Else
        • Set B^(r+1)(x) ← x·B^(r)(x)
      • endif
    • endfor


Output: Λ(x) and B(x).


Note that in the above description, superscript "(r)" stands for the r-th iteration and subscript "i" for the i-th coefficient.
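

To make the iteration concrete, here is a minimal sketch of the Berlekamp-Massey method in Python, written over a prime field GF(p) so the arithmetic reduces to plain modular operations; a GF(2^m) or hardware implementation follows the same control flow. The function name, list-based polynomial representation, and use of Fermat's little theorem for the inverse are illustrative choices, not prescribed by the patent.

```python
def berlekamp_massey(syndromes, p):
    """Sketch of the Berlekamp-Massey iteration over GF(p).

    syndromes: [S_0, ..., S_{n-k-1}] as integers mod p.
    Returns (Lambda, B, L): error locator and scratch polynomial coefficient
    lists (lowest degree first, possibly with trailing zeros) and LFSR length L.
    """
    Lam, B, L = [1], [1], 0                   # Lambda^(0)(x) = 1, B^(0)(x) = 1, L = 0
    for r in range(len(syndromes)):
        # Discrepancy: Delta_{r+1} = sum_i Lambda_i^(r) * S_{r-i}
        delta = sum(Lam[i] * syndromes[r - i]
                    for i in range(min(r, len(Lam) - 1) + 1)) % p
        # Lambda^(r+1)(x) = Lambda^(r)(x) - Delta_{r+1} * x * B^(r)(x)
        xB = [0] + B
        width = max(len(Lam), len(xB))
        new_Lam = [((Lam[i] if i < len(Lam) else 0)
                    - delta * (xB[i] if i < len(xB) else 0)) % p
                   for i in range(width)]
        if delta != 0 and 2 * L <= r:
            inv = pow(delta, p - 2, p)        # Delta^{-1} by Fermat's little theorem
            B = [(inv * c) % p for c in Lam]  # B^(r+1)(x) <- Delta^{-1} * Lambda^(r)(x)
            L = r + 1 - L
        else:
            B = xB                            # B^(r+1)(x) <- x * B^(r)(x)
        Lam = new_Lam
    return Lam, B, L
```

The returned LFSR length L is the quantity LΛ that the probabilistic transition rule described below compares with t−δ.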


At 106, a Chien search is performed. With a given Λ(x) and B(x), the final stage is to perform the Chien search to identify all qualified roots and apply the Koetter-Horiguchi formula to determine the corresponding error magnitudes.


At 108, it is determined whether the Chien search failed. The decoding is declared a failure when the Chien search fails to find a number of qualified roots equal to the degree of Λ(x).
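

A rough sketch of 106 and 108 follows: it exhaustively evaluates Λ(x) at every candidate inverse locator α^(−i) and declares failure when the number of roots found differs from the degree of Λ(x). It reuses the illustrative prime-field convention of the earlier sketches; error magnitude evaluation (e.g., via the Koetter-Horiguchi formula) is omitted.

```python
def chien_search(Lam, alpha, n, p):
    """Sketch of steps 106/108: return error positions i with Lambda(alpha^(-i)) = 0,
    or None when the root count does not match deg Lambda (decoding failure)."""
    degree = max(i for i, c in enumerate(Lam) if c != 0)   # ignore trailing zeros
    positions = []
    for i in range(n):
        x = pow(alpha, (-i) % (p - 1), p)     # alpha^(-i), using alpha^(p-1) = 1 in GF(p)
        val = 0
        for coeff in reversed(Lam):           # Horner's rule evaluation of Lambda(x)
            val = (val * x + coeff) % p
        if val == 0:
            positions.append(i)
    return positions if len(positions) == degree else None
```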


If the Chien search did not fail, the process is done at 110. If the Chien search failed, soft decision decoding is performed at 112.


The Berlekamp-Massey decoding can be pipelined into three stages. The syndrome computation takes n cycles. The key-equation solver (i.e., the core of the Berlekamp-Massey method) takes d cycles (in practice, the code rate is high and thus the minimum distance d is much smaller than the code length n). Finally, the Chien search and error evaluation take n cycles.


When soft-decision decoding is considered, it is conventional practice to perform two-level decoding, performing soft-decision decoding only if the hard-decision decoding fails. Since hard-decision decoding rarely fails, the extra power consumption due to switching to soft-decision decoding is negligible.


Moreover, the hard-decision decoding has much shorter latency than that of soft-decision decoding. Thus, the two-level decoding scheme effectively solves the issue of average long latency in some cases. However, for on-the-fly applications, it would be desirable to reduce the latency of the whole decoding process for all cases while compromising negligibly in power consumption and decoding failure rate.


Disclosed herein is a probabilistic transition rule such that soft-decision decoding is performed when the Berlekamp-Massey method fails at a preset rate that is significantly below the failure rate of the soft-decision decoding. The probabilistic transition rule allows soft-decision decoding to begin right after computing the error locator polynomial in the second stage while negligibly affecting power and performance.



FIG. 2 is a flow chart illustrating an embodiment of a process for performing two-level decoding using a probabilistic transition rule. In this case, a probabilistic transition rule is used to switch to soft-decision decoding without going through the Chien search, in contrast to the conventional scheme, where soft-decision decoding is started only after hard-decision decoding has definitively failed, which can only be determined after the Chien search.


The following proposition characterizes an insight of the Berlekamp-Massey method.


Proposition 1 (i). If the degree of Λ(x) satisfies LΛ ≤ t, then the discrepancies satisfy

Δr=0, r=2LΛ+1, 2LΛ+2, . . . ,n−k.  (5)


(ii). If LΛ ≤ t, then either Λ(x) is the genuine error locator polynomial corresponding to LΛ errors, or there are more than n−k−LΛ errors.


When the received word is not decodable (i.e., there are more than t errors), it is plausible to assume that the probability that Δr=0 for r=2LΛ+1, 2LΛ+2, . . . , n−k is roughly q^−(n−k−2LΛ), where q is the field size, following the fact that the value of Δr is essentially random within GF(q). It is also assumed that soft-decision decoding reduces the failure rate by a factor of fs on top of hard-decision decoding. Following these two assumptions, the following probabilistic transition rule for two-level decoding is obtained:


If LΛ>t−δ, then switch to soft-decision decoding, where δ satisfies

q^−(n−k−2t+2δ) << fs,  (6)


otherwise, perform the Chien search and error correction, and terminate the decoding thereafter.


where


LΛ denotes the length of a linear feedback shift register (LFSR) of Λ(x)


t is the error correction capability


δ is a tolerance limit


q is the finite field or Galois field size


The performance degradation due to the cases where more than t errors occur while LΛ ≤ t−δ is negligible. On the other hand, in practical applications δ << t; therefore, the extra power consumption due to switching to soft-decision decoding when t−δ < LΛ ≤ t is also negligible.
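

The sketch below shows one way the tolerance δ and the transition check might be computed in software. The margin used to interpret the "<<" of inequality (6) is an assumption introduced here for illustration; the patent only requires q^−(n−k−2t+2δ) to be much smaller than fs.

```python
def smallest_delta(q, n, k, t, f_soft, margin=1e-3):
    """Smallest delta >= 0 with q^-(n-k-2t+2*delta) <= margin * f_soft,
    an illustrative reading of inequality (6)."""
    delta = 0
    while q ** -(n - k - 2 * t + 2 * delta) > margin * f_soft:
        delta += 1
        if delta > t:            # degenerate: the rule would always choose soft decoding
            break
    return delta

def use_soft_decision(L_lambda, t, delta):
    """Probabilistic transition rule: skip the Chien search and go straight to
    soft-decision decoding when the LFSR length exceeds t - delta."""
    return L_lambda > t - delta
```

For example, with q=256, n−k=16, t=8, fs=10^−3, and the illustrative margin of 10^−3, the loop returns δ=2, since 256^−4 ≈ 2.3×10^−10 is far below fs.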


In such embodiments, the decision of whether to perform the second level of decoding, i.e., soft decision decoding, is based on the length of the LFSR.


In the embodiment of FIG. 2, two-level decoding using the proposed probabilistic transition rule is performed as follows. At 202, syndromes are calculated. For example, syndromes are calculated as at 102. At 204, key equations are solved. For example, key equations are solved as at 104. At 206, a probabilistic transition rule is checked. In the embodiment shown, checking includes determining whether LΛ>t−δ. If LΛ>t−δ, then soft decision decoding is performed at 208. In this example, a Chien search is not performed if soft decision decoding is performed at 208. In this case, the likelihood that soft decision decoding will succeed is larger than the likelihood that hard-decision decoding (the Chien search) will succeed. If LΛ ≤ t−δ, then the Chien search is performed at 210. In this case, the likelihood that the Chien search will succeed is higher than the likelihood that soft decision decoding will succeed. At 212, it is determined whether the Chien search fails. If the Chien search does not fail, then the process is done at 214. If the Chien search does fail, the result is discarded at 216. In other embodiments, if the Chien search fails, the process could proceed to 208 and soft decision decoding could be performed.
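

For completeness, the following sketch strings the earlier illustrative helpers together in the order of FIG. 2; soft_decision_decode is a placeholder for whichever second-level decoder the system uses.

```python
def two_level_decode(received, n, k, t, delta, alpha, p, soft_decision_decode):
    """Sketch of the FIG. 2 flow: returns error positions, the soft decoder's
    result, or None when the Chien search fails and the result is discarded."""
    syndromes = compute_syndromes(received, n - k, alpha, p)   # 202: compute syndromes
    if not any(syndromes):
        return []                                              # no errors detected
    Lam, _scratch, L_lambda = berlekamp_massey(syndromes, p)   # 204: solve key equations
    if use_soft_decision(L_lambda, t, delta):                  # 206: check transition rule
        return soft_decision_decode(received)                  # 208: soft decision decoding
    positions = chien_search(Lam, alpha, n, p)                 # 210: Chien search
    if positions is None:                                      # 212: Chien search failed
        return None                                            # 216: discard result
    return positions                                           # 214: done
```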



FIG. 3 is a block diagram illustrating an embodiment of a system for performing two-level decoding utilizing the proposed probabilistic transition rule. For example, system 300 may be used to perform the process of FIG. 2. In the example shown, encoded data is received at syndrome computation block 302. Syndromes are provided as input to key equation solver 304 which, for example, performs the Berlekamp-Massey method. Probabilistic transition rule checker 306 receives Λ(x) and B(x), and determines whether to perform the Chien search or soft decision decoding. Depending on the decision, various control signal values are passed from the probabilistic transition rule checker 306 to Chien search block 308 and soft decision decoding block 310 to turn on one or the other. In some embodiments, block 302 performs step 202, block 304 performs step 204, block 306 performs step 206, block 308 performs step 210, and block 310 performs step 208.


Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.

Claims
  • 1. A method of decoding data, comprising: receiving Reed-Solomon encoded data at an input; computing a syndrome polynomial for the Reed-Solomon encoded data, wherein computing is performed by a processor; applying the Berlekamp-Massey method to solve key equations associated with the syndrome polynomial; comparing a linear feedback shift register (LFSR) length LΛ with an error correction capability t, wherein the LFSR is associated with an error locator polynomial Λ(x); and based on the comparison, determining whether to perform soft decision decoding or a Chien search; in the event that the LFSR length LΛ is greater than a metric, performing soft decision decoding on the Reed-Solomon encoded data to produce decoded data; and providing the decoded data as output.
  • 2. A method as recited in claim 1, wherein the metric is a difference between the error correction capability t and a tolerance δ.
  • 3. A method as recited in claim 2, further including in the event that the LFSR length LΛ is less than the metric, performing a Chien search.
  • 4. A method as recited in claim 1, further including in the event that the LFSR length LΛ is less than a difference between the error correction capability t and a tolerance δ, performing a Chien search.
  • 5. A method as recited in claim 1, wherein comparing and determining are performed before performing a Chien search.
  • 6. A method as recited in claim 1, wherein in the event soft decision decoding is performed, the Chien search is not performed.
  • 7. A method as recited in claim 1, wherein in the event the Chien search is performed and the Chien search fails, a result is discarded.
  • 8. A system for decoding data, comprising: a processor configured to: compute a syndrome polynomial for Reed-Solomon encoded data; apply the Berlekamp-Massey method to solve key equations associated with the syndrome polynomial; compare a linear feedback shift register (LFSR) length LΛ with an error correction capability t, wherein the LFSR is associated with an error locator polynomial Λ(x); and based on the comparison, determine whether to perform soft decision decoding or a Chien search; a memory coupled with the processor, wherein the memory is configured to provide the processor with instructions.
  • 9. A system as recited in claim 8, wherein the processor is further configured to, in the event that the LFSR length LΛ is greater than a difference between the error correction capability t and a tolerance δ, perform soft decision decoding.
  • 10. A system as recited in claim 8, wherein the processor is further configured to, in the event that the LFSR length LΛ is less than a difference between the error correction capability t and a tolerance δ, perform a Chien search.
  • 11. A system as recited in claim 8, wherein comparing and determining are performed before performing a Chien search.
  • 12. A system as recited in claim 8, wherein in the event soft decision decoding is performed, the Chien search is not performed.
  • 13. A system as recited in claim 8, wherein in the event the Chien search is performed and the Chien search fails, a result is discarded.
  • 14. A computer program product for decoding data, the computer program product being embodied in a non-transitory computer readable medium and comprising computer instructions for: computing a syndrome polynomial for Reed-Solomon encoded data; applying the Berlekamp-Massey method to solve key equations associated with the syndrome polynomial; comparing a linear feedback shift register (LFSR) length LΛ with an error correction capability t, wherein the LFSR is associated with an error locator polynomial Λ(x); and based on the comparison, determining whether to perform soft decision decoding or a Chien search.
  • 15. A computer program product as recited in claim 14, further comprising computer instructions for, in the event that the LFSR length LΛ is greater than a difference between the error correction capability t and a tolerance δ, performing soft decision decoding.
  • 16. A computer program product as recited in claim 14, further comprising computer instructions for, in the event that the LFSR length LΛ is less than a difference between the error correction capability t and a tolerance δ, performing a Chien search.
  • 17. A computer program product as recited in claim 14, wherein comparing and determining are performed before performing a Chien search.
  • 18. A computer program product as recited in claim 14, wherein in the event soft decision decoding is performed, the Chien search is not performed.
  • 19. A computer program product as recited in claim 14, wherein in the event the Chien search is performed and the Chien search fails, a result is discarded.
CROSS REFERENCE TO OTHER APPLICATIONS

This application claims priority to U.S. Provisional Patent Application No. 60/901,957 entitled A PROBABILISTIC TRANSITION RULE FOR TWO-LEVEL DECODING OF REED-SOLOMON CODES filed Feb. 16, 2007 which is incorporated herein by reference for all purposes.

Provisional Applications (1)
Number Date Country
60901957 Feb 2007 US