I am trying to solve the exercise below and I am very confused.
Exercise: Given three random variables with joint probability distribution function: $$p(x, y, z)=p(x)p(y|x)p(z|x)$$ with
$$X:\quad p(X=0)=p(X=1)=1/4,\quad p(X=2)=1/2,$$ $$Y:\quad p(Y=0\mid X=0)=p(Y=1\mid X=1)=1,\quad p(Y=0\mid X=2)=p(Y=1\mid X=2)=1/2,$$ $$Z:\quad p(Z=0\mid X=0)=p(Z=0\mid X=1)=p(Z=1\mid X=2)=1.$$
Find $H(X),\ H(Y), \ H(Z).$
For $H(X)$:
$H(X)=-\sum_{i=1}^{3}p(x_i)\log_2 p(x_i)=3/2$.
I would like your help to find $H(Y)$ and try the rest by myself.
EDIT: I followed your suggestions and found the solution, but I would like a second opinion. Here is my solution.
$H(X)=-\sum_{i=1}^{3}p_X(x_i)\log_2 p_X(x_i) = -P(X=0)\log_2 P(X=0)-P(X=1)\log_2 P(X=1)-P(X=2)\log_2 P(X=2) = -\tfrac14\log_2\tfrac14-\tfrac14\log_2\tfrac14-\tfrac12\log_2\tfrac12 = 3/2 = 1.5 \text{ bits}.$
$H(Y)=-\sum_{i=1}^{2}p_Y(y_i)\log_2 p_Y(y_i) = -P(Y=0)\log_2 P(Y=0)-P(Y=1)\log_2 P(Y=1) = -\tfrac12\log_2\tfrac12-\tfrac12\log_2\tfrac12 = 1 \text{ bit}.$
For $Z$, marginalizing gives $P(Z=0)=P(Z=0\mid X=0)P(X=0)+P(Z=0\mid X=1)P(X=1)=\tfrac14+\tfrac14=\tfrac12$ and $P(Z=1)=P(Z=1\mid X=2)P(X=2)=\tfrac12$, so $H(Z)=-\sum_{i=1}^{2}p_Z(z_i)\log_2 p_Z(z_i) = -P(Z=0)\log_2 P(Z=0)-P(Z=1)\log_2 P(Z=1) = -\tfrac12\log_2\tfrac12-\tfrac12\log_2\tfrac12 = 1 \text{ bit}.$
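As a sanity check, here is a short Python sketch (the `entropy` helper and variable names are mine, not part of the exercise) that recomputes the three entropies from the marginal distributions $p_X=(1/4,1/4,1/2)$, $p_Y=(1/2,1/2)$, $p_Z=(1/2,1/2)$:

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability vector (0*log 0 treated as 0)."""
    return -sum(q * log2(q) for q in p if q > 0)

# Marginals from the exercise.
pX = [0.25, 0.25, 0.5]
pY = [0.5, 0.5]
pZ = [0.5, 0.5]

print(entropy(pX), entropy(pY), entropy(pZ))  # 1.5 1.0 1.0
```

Since every probability here is a power of two, the logarithms are exact and the output matches the hand computation.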
Once you compute the marginal probability of $Y$, you should be able to compute the entropy:
$$P(Y=y_i) = \sum_{x=0}^2P(Y=y_i|X=x)P(X=x)$$