The Wire-tap channel II (http://link.springer.com/chapter/10.1007%2F3-540-39757-4_5) article, in the proof of Theorem 1, uses Fano's inequality to bound the entropy $H(S|\hat{S}) \leq K \cdot h(P_e)$, where $S$ is the initial encoded word and $\hat{S}$ is the output of the decoder. Here $$P_e = \frac{1}{K} \sum_{i = 1}^K \Pr(S_i \neq \hat{S}_i)$$ is the average bit-error rate of the decoder, and $S_i$ denotes the $i$-th bit of $S$. Both $S$ and $\hat{S}$ are $K$ bits long, $h(x)$ denotes the binary entropy function, and $H(X|Y)$ denotes conditional entropy. Fano's inequality states $$H(S|\hat{S}) \leq h(\Pr[S \neq \hat{S}]) + \Pr[S\neq \hat{S}]\cdot\log_2(|\mathcal{X}| - 1),$$ where $\mathcal{X}$ is the alphabet of $S$.
Could you explain how to derive from that the inequality $H(S|\hat{S}) \leq K \cdot h(P_e)$?
The derivation of this version of Fano's inequality can be found in Appendix A of "The Wire-Tap Channel" by A. D. Wyner, Bell System Technical Journal, 1975.
A direct link to a PDF is available.
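For completeness, here is a sketch of the standard argument (using the notation of the question): apply the chain rule, then Fano's inequality bit by bit (the $\log_2(|\mathcal{X}|-1)$ term vanishes because each $S_i$ is binary), and finally Jensen's inequality via the concavity of $h$.

```latex
\begin{align*}
H(S \mid \hat{S})
  &= \sum_{i=1}^{K} H(S_i \mid S_1,\dots,S_{i-1},\hat{S})
     && \text{chain rule}\\
  &\le \sum_{i=1}^{K} H(S_i \mid \hat{S}_i)
     && \text{conditioning reduces entropy}\\
  &\le \sum_{i=1}^{K} h\bigl(\Pr[S_i \neq \hat{S}_i]\bigr)
     && \text{Fano, binary alphabet: } \log_2(|\mathcal{X}|-1) = 0\\
  &\le K \, h\Bigl(\tfrac{1}{K}\sum_{i=1}^{K} \Pr[S_i \neq \hat{S}_i]\Bigr)
   = K \, h(P_e)
     && \text{Jensen's inequality, } h \text{ concave}
\end{align*}
```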