Entropy of a single variable given the joint probability distribution of two variables


Let's say that we have a pair of random variables $(X,Y)$ that can take four values. The joint probabilities are $p(x,y) = \Pr(X=x \land Y=y)$:

$$p(0,0) = \frac 1 4, \qquad p(0,1) = \frac 3 8, \qquad p(1,0) = \frac 1 8, \qquad p(1,1) = \frac 1 4$$

Now, it is very easy to calculate the entropy $H(X,Y)$ straight from the definition.
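As a concrete check, here is a minimal Python sketch (the variable names `p` and `H_XY` are my own) that computes $H(X,Y)$ straight from the definition $H(X,Y) = -\sum_{x,y} p(x,y)\log_2 p(x,y)$:

```python
from math import log2

# Joint distribution from the question: p[(x, y)] = Pr(X=x, Y=y)
p = {(0, 0): 1/4, (0, 1): 3/8, (1, 0): 1/8, (1, 1): 1/4}

# Joint entropy straight from the definition:
# H(X, Y) = -sum_{x,y} p(x,y) * log2 p(x,y)
H_XY = -sum(q * log2(q) for q in p.values())
print(H_XY)  # ≈ 1.9056 bits
```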

However, let's say that I need to calculate $H(X)$. How should I go about that? Will $$H(X) = \Pr(Y=0)\,H(X \mid Y=0) + \Pr(Y=1)\,H(X \mid Y=1)$$

work?
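For reference, the right-hand side of that expression is, by definition, the conditional entropy $H(X \mid Y)$, which equals the marginal entropy $H(X)$ only when $X$ and $Y$ are independent. A quick numerical sketch (variable names are my own) comparing the two quantities for this distribution:

```python
from math import log2

# Joint distribution from the question
p = {(0, 0): 1/4, (0, 1): 3/8, (1, 0): 1/8, (1, 1): 1/4}

# Marginal of X: p_X(x) = sum_y p(x, y), then H(X) from the definition
pX = {x: sum(q for (a, y), q in p.items() if a == x) for x in (0, 1)}
H_X = -sum(q * log2(q) for q in pX.values())

# The proposed expression: sum_y Pr(Y=y) * H(X | Y=y),
# which is by definition the conditional entropy H(X|Y)
H_X_given_Y = 0.0
for y in (0, 1):
    pY = sum(q for (x, b), q in p.items() if b == y)
    cond = [p[(x, y)] / pY for x in (0, 1)]  # p(x | y)
    H_X_given_Y -= pY * sum(c * log2(c) for c in cond)

print(round(H_X, 4), round(H_X_given_Y, 4))  # 0.9544 vs 0.9512
```

Here the two values differ (and $H(X \mid Y) \le H(X)$ in general, with equality iff $X$ and $Y$ are independent), so the proposed formula computes a different quantity than the marginal entropy.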