Conditional entropy upper bound between two discrete random variables.


Suppose $X$ and $Y$ are two discrete random variables taking $k_x$ and $k_y$ possible states, respectively. Is it possible to determine the maximum achievable conditional entropy $H_{max}[Y \mid X]$ exactly, or is an approximation necessary?

Best answer:

Conditioning can't increase entropy, so $H(Y \mid X) \leq H(Y)$ always holds. Unless I'm misunderstanding the question, we can simply choose $X$ and $Y$ to be independent, which achieves equality: $H(Y \mid X) = H(Y)$.

Another standard fact is that for an alphabet of size $k_y$, $H(Y) \leq \log_2(k_y)$, with equality achieved by the uniform distribution. So, choosing $Y$ to be uniform over its size-$k_y$ alphabet and independent of $X$, we attain the upper bound $H_{max}[Y \mid X] = \log_2(k_y)$.
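The two-step argument above can be checked numerically. The sketch below (an illustration, not part of the original answer; the helper name `conditional_entropy` is my own) builds a joint distribution where $Y$ is uniform over $k_y = 4$ states and independent of $X$, and verifies that $H(Y \mid X)$ equals $\log_2(k_y) = 2$ bits:

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, given a joint distribution joint[x][y]."""
    h = 0.0
    for row in joint:
        p_x = sum(row)          # marginal p(x)
        if p_x == 0:
            continue
        for p_xy in row:
            if p_xy > 0:
                # contribution -p(x,y) * log2 p(y|x)
                h -= p_xy * math.log2(p_xy / p_x)
    return h

k_y = 4
# X has an arbitrary (non-uniform) marginal; Y is uniform and
# independent of X, so p(x, y) = p(x) / k_y.
p_x = [0.5, 0.3, 0.2]
joint = [[px / k_y for _ in range(k_y)] for px in p_x]

print(conditional_entropy(joint))  # ≈ log2(4) = 2.0
```

Changing `p_x` to any other marginal leaves the result at 2 bits, as the bound predicts, since each conditional distribution $p(y \mid x)$ is uniform.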