Fano's inequality and error rate


The Wire-tap Channel II paper (http://link.springer.com/chapter/10.1007%2F3-540-39757-4_5), in the proof of Theorem 1, uses Fano's inequality to bound the entropy $H(S|\hat{S}) \leq K \cdot h(P_e)$, where $S$ is the encoded word and $\hat{S}$ is the decoder's output. Here $$P_e = \frac{1}{K} \sum_{i = 1}^K Pr(S_i \neq \hat{S}_i)$$ is the bit-error rate of the decoder, $S_i$ denotes the $i$-th bit of $S$, and both $S$ and $\hat{S}$ are $K$ bits long. $h(x)$ denotes the binary entropy function and $H(X|Y)$ the conditional entropy. Fano's inequality states $$H(S|\hat{S}) \leq h(Pr[S \neq \hat{S}]) + Pr[S\neq \hat{S}]\cdot\log_2(|\mathcal{X}| - 1),$$ where $\mathcal{X}$ is the alphabet of $S$.

Could you explain how to derive from that the inequality $H(S|\hat{S}) \leq K \cdot h(P_e)$?
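As a numerical sanity check of Fano's inequality itself (not taken from the paper), here is a small Python sketch. The decoder model is a hypothetical assumption for illustration: $S$ is uniform on a 4-symbol alphabet, and the decoder outputs the correct symbol with probability 0.9, otherwise a uniformly random wrong symbol.

```python
import math
from itertools import product

def h(p):
    """Binary entropy in bits, with h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical toy model (not from the paper): S uniform on 4 symbols,
# decoder correct with probability 0.9, else uniform over the 3 wrong symbols.
symbols = range(4)
p_joint = {(s, sh): 0.25 * (0.9 if s == sh else 0.1 / 3)
           for s, sh in product(symbols, repeat=2)}

# Conditional entropy H(S | S_hat) = H(S, S_hat) - H(S_hat)
p_sh = {sh: sum(p_joint[s, sh] for s in symbols) for sh in symbols}
H_joint = -sum(p * math.log2(p) for p in p_joint.values() if p > 0)
H_sh = -sum(p * math.log2(p) for p in p_sh.values() if p > 0)
H_cond = H_joint - H_sh

# Error probability and the Fano right-hand side
p_err = sum(p for (s, sh), p in p_joint.items() if s != sh)
fano_rhs = h(p_err) + p_err * math.log2(len(symbols) - 1)
assert H_cond <= fano_rhs + 1e-12
```

In this particular model the bound is met with equality: given an error, the true symbol is uniform over the $|\mathcal{X}| - 1$ wrong values, which is exactly the case where Fano's inequality is tight.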


There are 2 answers below.

Best answer:

The derivation of this version of Fano's inequality can be found in Appendix A of The Wire-Tap Channel by A. D. Wyner, Bell System Technical Journal, 1975.

A direct link to a pdf
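For readers without access to the paper, the argument in that appendix runs along the following standard lines (sketched here from the usual proof, so it is worth checking against Wyner's original): apply the chain rule together with the fact that conditioning reduces entropy, then the binary ($|\mathcal{X}| = 2$) case of Fano's inequality to each bit, and finally Jensen's inequality for the concave function $h$:

$$H(S|\hat{S}) \le \sum_{i=1}^K H(S_i|\hat{S}) \le \sum_{i=1}^K H(S_i|\hat{S}_i) \le \sum_{i=1}^K h\big(Pr(S_i \neq \hat{S}_i)\big) \le K\, h\!\left(\frac{1}{K}\sum_{i=1}^K Pr(S_i \neq \hat{S}_i)\right) = K \cdot h(P_e).$$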

Another answer:

Let $ P_E = Pr( S \ne \hat{S})$. Then Fano says

$$H(S|\hat{S}) \leq h(P_E) + P_E \log_2(|\mathcal{X}| - 1)$$

But by the union bound, $P_E \le \sum_{i=1}^K Pr(S_i \ne \hat{S}_i) = K P_e$. Furthermore, $|\mathcal{X}| \le 2^K$, so $\log_2(|\mathcal{X}| - 1) < K$.
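The union bound here can be checked in its pointwise form: for any two $K$-bit words, the indicator of "some bit differs" is at most the number of differing bits; taking expectations then gives $P_E \le K P_e$. A small exhaustive Python check (the choice $K = 4$ is arbitrary):

```python
from itertools import product

K = 4
pairs_checked = 0
# Pointwise union bound: 1[s != sh] <= sum_i 1[s_i != sh_i].
# Taking expectations gives Pr(S != S_hat) <= sum_i Pr(S_i != S_hat_i) = K * P_e.
for s in product([0, 1], repeat=K):
    for sh in product([0, 1], repeat=K):
        any_diff = 1 if s != sh else 0
        num_diff = sum(a != b for a, b in zip(s, sh))
        assert any_diff <= num_diff
        pairs_checked += 1
assert pairs_checked == (2 ** K) ** 2
```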

Then (assuming we are in the region $K P_e < 1/2$, where $h(\cdot)$ is increasing) we get

$$H(S|\hat{S}) \leq h(K P_e) + K P_e \log_2(|\mathcal{X}| - 1) \leq h(K P_e) + K^2 P_e $$

Because of concavity of the binary entropy function (together with $h(0) = 0$), $h(K P_e) \le K\, h(P_e)$, and so

$$H(S|\hat{S}) \leq K h(P_e) + K^2 P_e $$
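The subadditivity step $h(K P_e) \le K\, h(P_e)$ follows from concavity and $h(0) = 0$: with $\lambda = 1/K$ and $x = K P_e$, concavity gives $h(x/K) \ge (1/K)\, h(x)$. A quick numerical check over a grid (the grid size and $K = 8$ are arbitrary choices):

```python
import math

def h(p):
    """Binary entropy in bits, with h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

K = 8
points_checked = 0
# Concavity of h and h(0) = 0 imply h(p) = h((K*p)/K) >= (1/K) * h(K*p),
# i.e. h(K*p) <= K * h(p), valid as long as K*p stays inside [0, 1].
for i in range(1, 1000):
    p = i / (1000 * K)          # K*p sweeps over (0, 1)
    assert h(K * p) <= K * h(p) + 1e-12
    points_checked += 1
assert points_checked == 999
```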

This is not the desired bound. However, if $P_e \ll 2^{-K} \ll 1$, the second term can be negligible. (This is not a rigorous bound, however.)