The main design principle is to send a redundant message over the channel, which contains a codeword plus one or more check bits. The check bits are created by multiplying a generator matrix with the source codeword; the matrix is usually very big (thousands of bits), but also very sparse, so the number of operations stays small. Generator matrices must obey some rules, the main one being that the rows must be linearly independent (the matrix must have full rank) — otherwise one row would be a linear combination of the others, and the corresponding information bit would be lost.
Decoding is quite simple — the most powerful approach is belief propagation: each check node in the generation graph (or generator matrix, which is just a different representation of the same entity) tells the codeword nodes what it believes a valid bit to be, with a calculated probability. Once the error probability that a given bit is zero or one becomes less than the requested number (according to Shannon's theorem, a code which allows the error probability to be reduced arbitrarily always exists for a given rate and channel capacity), codeword calculation (decoding) is complete. There are two families of algorithms — hard and soft decision; the former is simpler, but the latter is frequently faster. Let me show an example of the hard decision algorithm (taken from "LDPC Codes – a brief Tutorial" by Bernhard M.J. Leiner, although the description there is far from perfect).
Let the generator matrix be this (not very sparse) one:
0 1 0 1 1 0 0 1
1 1 1 0 0 1 0 0
0 0 1 0 0 1 1 1
1 0 0 1 1 0 1 0
And the source codeword is:
1 0 0 1 0 1 0 1
The check word, calculated by multiplying the matrix with the codeword, is:
0 0 0 0
Suppose that during transmission of the codeword and check word over the channel the codeword was changed to this (second bit flipped):
1 1 0 1 0 1 0 1
Here is the generation graph (of the kind originally proposed by Tanner) for the given matrix:
The decoding algorithm starts with each codeword node C sending its bit to each connected check node:
F0 node will receive 1 1 0 1 bits from C1, C3, C4 and C7 respectively.
F1 node will receive 1 1 0 1 bits
F2 node will receive 0 1 0 1 bits
F3 node will receive 1 1 0 0 bits
The next step is to calculate the answer for each code node.
The received check word is 0 0 0 0 (calculated above), so a set of simple equations starts here. Each check node F takes three out of its four received bits, XORs them (sums them modulo 2, since this example works in the Galois field of two elements — GF(2)), and sends to the remaining codeword node the bit it expects there to satisfy the received check bit. Here is an example for the first check node:
X0 ^ 1 ^ 0 ^ 1 = 0, so X0 = 0
1 ^ X1 ^ 0 ^ 1 = 0, so X1 = 0
1 ^ 1 ^ X2 ^ 1 = 0, so X2 = 1
1 ^ 1 ^ 0 ^ X3 = 0, so X3 = 0
Then each Xi is sent to the corresponding code node (for F0 these are C1, C3, C4 and C7). After all check nodes are processed, the codeword nodes have the following sets of bits:
C0: 0 from F1, 1 from F3, 1 from originally received codeword.
C1: 0 from F0, 0 from F1, 1 from originally received codeword.
C2: 1 from F1, 0 from F2, 0 from originally received codeword.
C3: 0 from F0, 1 from F3, 1 from originally received codeword.
C4: 1 from F0, 0 from F3, 0 from originally received codeword.
C5: 0 from F1, 1 from F2, 1 from originally received codeword.
C6: 0 from F2, 0 from F3, 0 from originally received codeword.
C7: 0 from F0, 1 from F2, 1 from originally received codeword.
Then, voting on each bit (i.e. taking the value which has more 'votes' out of the three in the table above), we get a new codeword:
1 0 0 1 0 1 0 1
The same steps are then repeated until the codeword stops changing. In our case we get it after the first run.
The soft decision algorithm uses essentially the same logic, but operates with probabilities of each bit being 1 or 0; the probabilities are recalculated on every run, and once a bit's probability exceeds the requested value (or the error probability drops below the requested value), the loop stops.
Real-world examples use much bigger codewords (up to several thousand bits), but the logic is always the same.
This is a presentation of the low-density parity check code iterative decoding algorithm. The original image was a 50×50 bitmap 'transferred' over a Gaussian channel (to the degree that the glibc random number generator rand() approximates one), where 10% noise (flipped bits) was introduced. The rate of the 'transmission' is 0.5 (i.e. only half of the channel was used to transfer the data, and half for the parity bits). The code uses the hard decoding algorithm, which stops after 4 iterations, once all errors are corrected. The algorithm for matrix generation (which takes most of the processing time) may be buggy, but that does not matter for this presentation.
In the original presentation by MacKay and Neal (1995), the authors fully recovered a 10000-bit image after 'transmission' over a 7.5% noise channel.
According to my study, LDPC codes and other similar (let's call them 'probabilistic') codes cannot be used alone for fully reliable transfer, since the probability of error correction never equals one: for a given matrix some noise patterns can be fixed, while others cannot. If I understand LDPC correctly, there exist matrices guaranteed to fix at least some number of errors, but generating them is generally very complex (my algorithm tries to automate that a bit, though), and I failed to find a precise description of the error-recovery rate (i.e. how many bits can be 100% recovered by a given matrix — a set of weights and word/checksum sizes). So, for guaranteed transmission some kind of combined algorithm must be used: LDPC or another probabilistic code to correct as many errors as possible, which then hands the result to a fixed-blocksize decoder (like Reed–Solomon) that guarantees recovery of the number of bits specified at encoding time. Such a combined algorithm will behave better than either of its parts (LDPC and RS separately), because the probabilistic LDPC stage corrects enough errors that the residual error rate becomes small enough for the RS decoder.
One can find my thoughts about why LDPC codes are not suitable for redundant or distributed data storages here.
One can find the source code for the LDPC encoder/decoder (code.c), the image to bitmap translator (file2bit.c) and the bitmap to picture viewer (gen_images.c) in the archive. The last two require the GTK development library.