When does entropy equal the optimal expected code length?


In information theory, we know that the entropy is a lower bound on the expected length of an optimal code. But when does this bound hold with equality?

Also, is it possible to determine if they are equal by looking at the binary tree formed by the code?


There are 2 best solutions below


If the distribution is $D$-adic, i.e., $p_i = D^{-\ell_i}$ for integers $\ell_i$, where $D$ is the alphabet size ($D=2$ for the binary alphabet), then the Kraft inequality is satisfied with equality, and the code with codeword lengths $\ell_i$ has expected length equal to the entropy.
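As a quick numeric illustration (my own example, not from the answer above): for a dyadic distribution such as $(1/2, 1/4, 1/8, 1/8)$, the lengths $\ell_i = -\log_2 p_i$ are integers, the Kraft sum equals 1, and the expected length matches the entropy.

```python
import math

# A dyadic (2-adic) distribution: every p_i is a power of 1/2.
p = [1/2, 1/4, 1/8, 1/8]

# For a dyadic source the optimal codeword lengths are l_i = -log2(p_i),
# which are exact integers here.
lengths = [int(-math.log2(pi)) for pi in p]

kraft_sum = sum(2 ** -l for l in lengths)               # sum of D^{-l_i}
avg_length = sum(pi * l for pi, l in zip(p, lengths))   # expected length L
entropy = -sum(pi * math.log2(pi) for pi in p)          # H in bits

print(lengths)     # [1, 2, 3, 3]
print(kraft_sum)   # 1.0  -> Kraft inequality holds with equality
print(avg_length)  # 1.75
print(entropy)     # 1.75 -> L = H
```

The lengths $[1, 2, 3, 3]$ are exactly the depths of the leaves in the corresponding binary code tree, so for a dyadic source the tree is "full": every internal node has both children used.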


In the Cover & Thomas textbook it is proved (Theorem 5.3.1) that the average length $L$ of a $D$-ary instantaneous code satisfies

$$L \ge H_D$$

where $H_D$ is the entropy (in base $D$) of the source. Equality holds if and only if $p_i = D^{-l_i}$, i.e., if all the symbol probabilities have the form $(1/D)^{k}$ for integer $k$.

The result can be generalized to uniquely decodable codes.