How to calculate conditional entropy using this tabular probability distribution?


Assuming variables $x$ and $y$ are independent, how can I calculate the conditional entropy $H(y \mid x=3)$ from the given probability distribution?

[image: table of the joint probability distribution — not reproduced here]

Since the variables are independent, we can easily calculate every joint probability $p_{ij} = p_x \cdot p_y$ from the marginals, and then compute the joint entropy $\mathrm{H}(X, Y)$ using the formula:

$\mathrm{H}(X, Y)=-\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} \mathrm{P}(x, y) \log _{2} \mathrm{P}(x, y) = 2.5$
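A minimal sketch of this computation. Since the table from the question's image is not reproduced here, the marginals below are hypothetical placeholders; the point is only to show that, under independence, the joint entropy computed from $p_{ij} = p_x \cdot p_y$ equals $H(X) + H(Y)$.

```python
import math

# Hypothetical marginal distributions (the actual table is not shown).
p_x = [1/2, 1/4, 1/4]
p_y = [1/2, 1/3, 1/6]

def entropy(ps):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Joint distribution under independence: p_ij = p_x[i] * p_y[j].
p_joint = [px * py for px in p_x for py in p_y]

H_joint = entropy(p_joint)
# For independent variables, H(X, Y) = H(X) + H(Y).
print(H_joint, entropy(p_x) + entropy(p_y))
```

With the actual marginals from your table, the same code would reproduce the value $2.5$ you obtained.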

But I'm not sure how to calculate the conditional entropy $H(y \mid x=3)$.




Apply the same reasoning, but use the conditional distribution derived from the bivariate table you posted.

The definition of conditional entropy given a particular value $X = x$ is

$$H(Y \mid X = x) = -\sum_{y \in \mathcal{Y}} \mathrm{P}(y \mid x) \log_{2} \mathrm{P}(y \mid x)$$

The conditional distribution $Y \mid X=3$ takes the values

$$Y \in \{1, 2, 3\}$$

with probabilities

$$\left\{\frac{3}{6}, \frac{2}{6}, \frac{1}{6}\right\}$$
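Plugging this conditional distribution into the definition gives the requested entropy. Note that because $x$ and $y$ are independent, conditioning on $x=3$ does not change the distribution of $y$, so $H(Y \mid X=3) = H(Y)$. A short sketch of the computation:

```python
import math

# Conditional distribution of Y given X = 3 (equal to the marginal
# distribution of Y, by independence).
p_y_given_x3 = [3/6, 2/6, 1/6]

# H(Y | X = 3) = -sum p(y|x) * log2 p(y|x), in bits.
H = -sum(p * math.log2(p) for p in p_y_given_x3)
print(round(H, 4))  # ≈ 1.4591 bits
```

So $H(Y \mid X=3) = \frac{1}{2}\log_2 2 + \frac{1}{3}\log_2 3 + \frac{1}{6}\log_2 6 \approx 1.459$ bits.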