Block Codes Definition


I'm studying Information Theory, specifically noisy channel coding, and I couldn't understand the definition of an $(N, K)$ block code.

The definition is as follows:

An $(N, K)$ block code for a channel $Q$ is a list of $S = 2^K$ codewords

$$\{x^{(1)}, x^{(2)}, \ldots, x^{(2^K)}\}, \qquad x^{(s)} \in \mathcal{A}_X^N,$$

each of length $N$. The number of codewords $S$ is an integer, but the number of bits specified by choosing a codeword, $K \equiv \log_2 S$, is not necessarily an integer.

And the rate is defined as

$$R = K/N.$$

There are $S$ codewords, each of length $N$. What does $K$ mean here? What does $K$ represent in the code?

Why is the rate defined in terms of $K$? In my thinking, the rate of a block code over a noisy channel should be the number of information-carrying bits divided by the length of the block. In that case, if $K$ is the number of information-carrying (non-redundant) bits, what does $K$ have to do with the number of blocks, $S = 2^K$?

Also, in the Wikipedia article on block codes, $K$ (referred to as $n$ in the wiki) is defined as the number of symbols in a block. That makes sense to me, but it doesn't hold with the book's definition. The number of blocks is $S$ in the definition, and $S = 2^K$!

I couldn't answer these questions and I'm really confused.

Best answer:

> in Wikipedia's article on block codes, $K$ (referred to as $n$ in the wiki) is defined as the number of symbols in a block. That makes sense to me but doesn't hold with the book's definition. The number of blocks is $S$ in the definition, and $S = 2^K$!

No. Don't confuse the "number of symbols in a block" (= block length = codeword length) with the "number of blocks" (= number of messages = number of codewords = codebook size).

In an $(n,k)$ binary block code, $n$ is the block length (the number of bits in a codeword) and $k$ is the number of "information bits", i.e., the length of the "raw" (unencoded) binary message. Then $S = 2^k$ is the number of messages (or, equally, the number of codewords).

For example, a binary $(7,4)$ code has $S = 2^4 = 16$ messages, which can be represented in raw binary encoding with 4 bits, and each of these messages is encoded in a codeword (block) of length $7$. The rate is $R = k/n = 4/7$. The "number of symbols (here, bits) in a block" is $7$ (that is, $n$). The number of blocks (or messages) is $16$.
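The $(7,4)$ example can be made concrete in code. The sketch below is an assumption of mine, not part of the answer: it uses the standard systematic Hamming$(7,4)$ generator matrix as one particular $(7,4)$ code, enumerates all $S = 2^4 = 16$ messages, encodes each into a length-$7$ codeword, and computes the rate:

```python
# Sketch (my assumption): one concrete (7,4) binary block code, the
# systematic Hamming(7,4) code, to illustrate n, k, S, and R = k/n.
import itertools

n, k = 7, 4
S = 2 ** k      # number of messages = number of codewords = 16
R = k / n       # code rate, 4/7

# Generator matrix G in systematic form [I_4 | P]; each row has length n.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg):
    """Encode a k-bit message into an n-bit codeword: c = m G (mod 2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

# One codeword per message: the codebook has S = 2^k entries.
codewords = [encode(m) for m in itertools.product([0, 1], repeat=k)]
print(f"n={n}, k={k}, S={len(codewords)}, rate R = k/n = {R:.4f}")
```

Note that the codebook size $S$ grows exponentially in $k$, while the rate depends only on the ratio $k/n$; that is exactly why the rate is stated in terms of $K$ rather than $S$.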

The above definition generalizes this to codes that are possibly not binary, and where the number of messages is not necessarily a power of 2.
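To illustrate the general case, here is a toy sketch (the codeword list is made up purely for illustration): a code over a ternary alphabet with $S = 5$ codewords of length $N = 4$, for which $K = \log_2 S$ is not an integer, exactly as the book's definition allows:

```python
# Sketch (my assumption): an arbitrary toy code over alphabet {0,1,2}
# with S = 5 codewords of length N = 4, so K = log2(S) is non-integer.
import math

N = 4
codewords = ["0000", "0111", "1011", "2102", "2220"]  # made-up example list
S = len(codewords)   # S = 5, not a power of 2
K = math.log2(S)     # bits specified by choosing one codeword, about 2.32
R = K / N            # rate in bits per channel symbol
print(f"S={S}, K=log2(S)={K:.3f} bits, rate R = K/N = {R:.3f}")
```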