Prove the entropy formula for a source with $m > 2$ channel symbols.


I'm trying to prove that a source with $m>2$ symbols (source symbols), where one symbol has probability $\alpha\ll 1/m$ and the remaining symbols are equiprobable, has entropy:

$$H(X)= \log_2(m-1)+\alpha \log_2(1/\alpha)$$

Any help will be appreciated!

There are 2 solutions below.

Solution 1:

So $m-1$ of the symbols have probability $(1-\alpha)/(m-1)$, which is approximately $1/(m-1)$. These symbols contribute the sum $$ -(m-1)\frac{1-\alpha}{m-1}\log_2\frac{1-\alpha}{m-1}, $$ which is then approximately equal to what?
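Spelling out where that sum leads (a sketch under the same small-$\alpha$ assumption): the $(m-1)$ factors cancel, so the sum equals

$$ -(1-\alpha)\log_2\frac{1-\alpha}{m-1} = (1-\alpha)\log_2(m-1) - (1-\alpha)\log_2(1-\alpha) \approx \log_2(m-1), $$

since both $\alpha\log_2(m-1)$ and $(1-\alpha)\log_2(1-\alpha)$ vanish as $\alpha\to 0$. Adding the rare symbol's contribution $-\alpha\log_2\alpha = \alpha\log_2(1/\alpha)$ recovers the quoted approximation.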

Solution 2:

Hint: There is a well-known and useful result: if you have two random variables $X_1$ and $X_2$ whose supports (alphabets) don't overlap, and you form a mixture of them ($X=X_1$ with probability $p$, $X=X_2$ with probability $1-p$), then the entropy of the mixture is $H(X)=p \, H(X_1)+(1-p)\,H(X_2) + h(p)$, where $h(\cdot)$ is the binary entropy function.
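A quick numerical check of that mixture identity, with illustrative alphabets chosen here for the example (not taken from the question):

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# X1 uniform on 3 symbols, X2 uniform on 2 symbols, disjoint supports.
p = 0.1
px1 = [1/3, 1/3, 1/3]
px2 = [1/2, 1/2]

# Distribution of the mixture X: each X1 symbol scaled by p,
# each X2 symbol scaled by 1 - p (supports don't overlap).
px = [p * q for q in px1] + [(1 - p) * q for q in px2]

h_p = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # binary entropy h(p)
lhs = H(px)
rhs = p * H(px1) + (1 - p) * H(px2) + h_p
print(lhs, rhs)  # identical up to floating-point rounding
```

The identity is exact (not an approximation), which is what makes it a convenient starting point for the problem: it splits $H(X)$ into the uniform part, the rare-symbol part, and $h(\alpha)$.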

Hint 2: The formula you quote is not exact; it's an approximation that is only valid for $\alpha \ll 1/m$. It should be $H(X) \approx \log_2(m-1)+\alpha \log_2(1/\alpha)$.

Just write down the exact form of $H(X)$ (I gave one, just one, possible way to do this in my first hint), and see what happens to it when $\alpha \ll 1/m$.
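To see how good the approximation gets, here is a small numerical comparison; the values of $m$ and $\alpha$ are illustrative, not from the question:

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# One rare symbol with probability alpha << 1/m; the other m - 1
# symbols share the remaining mass equally.
m = 100
alpha = 1e-6  # much smaller than 1/m = 0.01

exact = H([alpha] + [(1 - alpha) / (m - 1)] * (m - 1))
approx = math.log2(m - 1) + alpha * math.log2(1 / alpha)
print(exact, approx)
```

With $\alpha$ this small the two values agree to several decimal places; increasing $\alpha$ toward $1/m$ makes the gap visible, which is exactly the point of Hint 2.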