Information Theory

BCH code. Bose–Chaudhuri–Hocquenghem codes.

The 1950s, the age of contemporary mathematics. This was the decade when the mathematics behind information theory boomed: from Claude Shannon laying out information theory in 1948 to BCH codes being invented independently by Hocquenghem in 1959 and by Bose and Ray-Chaudhuri in 1960, the era saw some really significant contributions.

Fun tangent: quicksort was invented by Tony Hoare in 1960. Boy, the timeline is more non-linear than we think it is.

I could just keep going off on tangents. For instance, Alexis Hocquenghem‘s son, Guy Hocquenghem, probably wrote the first documented book about queer theory. Like, wow. But I don’t have the luxury of time right now, so I am going to focus back on BCH codes.

BCH codes are random-error-correcting cyclic codes. Phew! That was a lot of words. Most of the background on linear and cyclic codes is covered in the previous article on the topic.

  • These codes work for multiple-error correction: a single BCH code can correct several random errors per block.

For any integers m \ge 3 and t < 2^{m-1}, there exists a binary BCH code with the following parameters:

  • Block length: n = 2^m - 1
  • Number of parity-check digits: n - k \le mt
  • Minimum distance: d_{min} \ge 2t + 1

This is a t-error-correcting BCH code. Let’s take m = 4 and t = 3 (the constraint t < 2^{m-1} = 8 is satisfied), which gives a triple-error-correcting BCH code with the following guarantees (the small sketch after this list just tabulates them for every allowed t):

  • Block length n = 2^4 - 1 = 15
  • Parity-check digits n - k = 15 - k \le 12
  • d_{min} \ge 7
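
To make the theorem concrete, here is a tiny sketch (my own, not from any reference) that just prints the guaranteed parameters for m = 4 and every allowed t:

```python
# Parameters guaranteed by the theorem for m = 4.
# Note: for larger t the bound n - k <= mt exceeds n itself, so it stops being informative.
m = 4
n = 2**m - 1
for t in range(1, 2**(m - 1)):          # the theorem requires t < 2^(m-1)
    print(f"t={t}: n={n}, parity-check digits n-k <= {m*t}, d_min >= {2*t + 1}")
```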

A primitive element of a finite field GF(q) is a generator of the multiplicative group of the field. In other words, \alpha \in GF(q) is called a primitive element if it is a primitive (q - 1)th root of unity in GF(q); this means that each non-zero element of GF(q) can be written as \alpha^i for some integer i. (Wikipedia)
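
Since this definition does a lot of work later, here is a minimal numerical sketch (mine, not from the post) that checks it for GF(2^4), the same field used in the example at the end: with the field built from the primitive polynomial p(x) = x^4 + x + 1, the element \alpha = x is primitive, so its powers \alpha^0, ..., \alpha^{14} hit every non-zero field element exactly once.

```python
# GF(2^4) built from p(x) = x^4 + x + 1; elements are 4-bit integers.
P = 0b10011

def gf16_mul(a, b):
    """Multiply two GF(2^4) elements: carry-less multiply, reduce modulo p(x)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0b10000:          # degree reached 4: XOR away p(x)
            a ^= P
    return r

alpha, elem, powers = 0b0010, 1, []   # alpha is the element "x"
for i in range(15):
    powers.append(elem)
    elem = gf16_mul(elem, alpha)

# alpha is a primitive 15th root of unity: 15 distinct non-zero powers, then it wraps to 1
assert sorted(powers) == list(range(1, 16))
assert elem == 1
print([f"alpha^{i} = {p:04b}" for i, p in enumerate(powers)])
```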

To understand BCH codes and linear block codes, there needs to be some understanding of finite fields, also called Galois fields. I do fan-girl a lot over Galois, the dude was amazing, but I will leave that for some other day. I will be referring to MIT 6.451 Principles of Digital Communication II.

I also refer to the resources for that lecture.

“The problem is not to construct codes whose performance under maximum likelihood decoding is near that of the Shannon limit — you can use random codes for that. But in constructing these more algebraically structured codes, what we’re really trying to do is to facilitate the decoding problem, which is the central problem.”

Let’s say g(x) is the generator polynomial of a t-error-correcting BCH code of length n = 2^m - 1. By definition, g(x) is the lowest-degree binary polynomial that has \alpha, \alpha^2, \ldots, \alpha^{2t} as roots, where \alpha is a primitive element of GF(2^m). Writing \phi_i(x) for the minimal polynomial of \alpha^i, we get

g(x) = LCM \{ \phi_1(x), \phi_3(x), \ldots, \phi_{2t-1}(x) \}

Only the odd indices appear because \alpha^{2i} is a conjugate of \alpha^i and therefore shares its minimal polynomial. Each \phi_i(x) has degree at most m, so the degree of g(x), which equals the number of parity-check digits n - k, is at most mt.
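
As a concrete instance (these are the standard textbook values for the GF(2^4) construction shown in the example below, not something derived in this post), the minimal polynomials of \alpha, \alpha^3 and \alpha^5 are

\phi_1(x) = 1 + x + x^4
\phi_3(x) = 1 + x + x^2 + x^3 + x^4
\phi_5(x) = 1 + x + x^2

so for t = 3 the generator polynomial is

g(x) = \phi_1(x) \phi_3(x) \phi_5(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^{10}

which has degree n - k = 10, within the bound mt = 12. This is exactly the (15, 5) triple-error-correcting BCH code from the parameters above.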

Example:
GF(16) = GF(2^4), constructed from a primitive element \alpha satisfying 1 + \alpha + \alpha^4 = 0, i.e. \alpha is a root of the primitive polynomial p(x) = 1 + x + x^4.
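
To tie everything together, here is a rough, self-contained sketch (my own code, not from the MIT notes): it builds GF(2^4) from 1 + \alpha + \alpha^4 = 0, computes the minimal polynomials \phi_i(x) of \alpha^i from their conjugates, and multiplies the distinct ones together to get g(x) for t = 3. It reproduces the degree-10 generator polynomial quoted above.

```python
# GF(2^4) defined by p(x) = x^4 + x + 1, i.e. 1 + alpha + alpha^4 = 0.
M, N, PRIM = 4, 15, 0b10011

def gf_mul(a, b):
    """Multiply two GF(2^4) elements represented as 4-bit integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << M):                 # degree hit 4: reduce modulo p(x)
            a ^= PRIM
    return r

# Power table: alpha_pow[i] = alpha^i, where alpha is the element "x" (0b0010).
alpha_pow = [1]
for _ in range(N - 1):
    alpha_pow.append(gf_mul(alpha_pow[-1], 0b0010))

def min_poly(i):
    """Minimal polynomial of alpha^i over GF(2), built from its conjugates
    alpha^i, alpha^(2i), alpha^(4i), ... (the cyclotomic coset of i mod 15).
    Returned as a coefficient list, lowest degree first."""
    coset, j = [], i % N
    while j not in coset:
        coset.append(j)
        j = (2 * j) % N
    poly = [1]                           # the constant polynomial 1, over GF(16)
    for j in coset:                      # multiply by (x + alpha^j) for each conjugate
        root = alpha_pow[j]
        shifted = [0] + poly                                  # poly * x
        scaled = [gf_mul(c, root) for c in poly] + [0]        # poly * alpha^j
        poly = [u ^ v for u, v in zip(shifted, scaled)]
    assert all(c in (0, 1) for c in poly)                     # coefficients land in GF(2)
    return poly

def poly_mul_gf2(a, b):
    """Multiply two binary polynomials given as coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

def bch_generator(t):
    """g(x) = LCM{phi_1, phi_3, ..., phi_{2t-1}}. Because each phi_i is irreducible,
    the LCM is just the product of the distinct factors."""
    g, seen = [1], set()
    for i in range(1, 2 * t, 2):
        phi = tuple(min_poly(i))
        if phi not in seen:
            seen.add(phi)
            g = poly_mul_gf2(g, list(phi))
    return g

g = bch_generator(3)
print("g(x) coefficients, x^0 first:", g)    # 1 + x + x^2 + x^4 + x^5 + x^8 + x^10
print("n - k = deg g(x) =", len(g) - 1)      # 10, within the bound mt = 12
```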
