What is the minimum number of bits required to correct all 2-bit errors?

I learned about Hamming codes and how to use them to correct 1-bit errors and detect all 2-bit errors, but how can I extend this to correct 2-bit errors, and possibly more?

1 answer

I think I get it.

N = number of data bits, k = number of check bits added for error correction (e.g. the parity bits in a Hamming code)

In any ECC scheme, you have 2^(N+k) possible bit strings.

For a single-bit error:

You must find a k such that the total number of possible bit strings is at least the number of strings that lie within 1 bit of some valid codeword.

The total number of strings within at most 1 bit of a valid codeword is 2^N * (N + k + 1):

1 error-free string, plus N + k strings with a 1-bit error, for each of the 2^N valid codewords.

2^(N+k) >= (2^N) * (N + k + 1)

You just need to plug in values of k until you find one that satisfies the inequality above (or solve it however you like).
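
Here is a minimal sketch of that search (my own illustration, not part of the original answer; the function name `min_check_bits_1bit` is made up). It simply increments k until the inequality holds:

```python
def min_check_bits_1bit(N):
    # Smallest k with 2^(N+k) >= 2^N * (N+k+1), i.e. the
    # sphere-packing (Hamming) bound for 1-bit error correction.
    k = 1
    while 2 ** (N + k) < 2 ** N * (N + k + 1):
        k += 1
    return k

print(min_check_bits_1bit(4))  # 3
```

For N = 4 data bits this gives k = 3, matching the classic Hamming(7,4) code.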

Similarly, for a 2-bit error:

1 error-free string, N + k strings with a 1-bit error, and (N + k choose 2) strings with a 2-bit error, for each valid codeword.

2^(N+k) >= (2^N) * (N + k + 1 + (N + k choose 2))
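
The same search generalizes to correcting up to t bit errors by summing the binomial coefficients up to t. A hedged sketch (again my own, not from the answer; note this computes only the sphere-packing lower bound on k, so a code that actually achieves it need not exist):

```python
from math import comb

def min_check_bits(N, t):
    # Smallest k with 2^(N+k) >= 2^N * sum_{i=0..t} C(N+k, i):
    # the sphere-packing bound for correcting up to t bit errors.
    k = 1
    while 2 ** (N + k) < 2 ** N * sum(comb(N + k, i) for i in range(t + 1)):
        k += 1
    return k

print(min_check_bits(4, 1))  # 3, same as the single-bit case above
print(min_check_bits(4, 2))  # 6 check bits for 4 data bits, 2-bit correction
```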
