Following "Calculating conditional entropy given two random variables", I would like to figure out how to extend conditional entropy to several conditioning variables. For two conditioning variables:
$$H(X\mid Y,Z) = -\sum p(x,y,z)\log_2 p(x\mid y,z)$$
$$H(X\mid Y,Z)=H(X,Y,Z)-H(Y,Z)=H(X,Y,Z)-H(Y\mid Z)-H(Z)$$
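(Not part of the original question: a small numerical sanity check of the two-variable identity above, on a randomly generated joint distribution. The variable names and alphabet sizes are arbitrary choices.)

```python
# Verify H(X|Y,Z) = H(X,Y,Z) - H(Y,Z) on a random joint distribution.
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((2, 3, 4))   # joint p(x, y, z) on a 2x3x4 alphabet
p /= p.sum()                # normalize to a probability distribution

def H(q):
    """Shannon entropy in bits of a distribution given as an array."""
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

# Direct definition: H(X|Y,Z) = -sum p(x,y,z) log2 p(x|y,z)
p_yz = p.sum(axis=0)                 # marginal p(y, z)
p_x_given_yz = p / p_yz              # conditional p(x|y,z), by broadcasting
direct = -(p * np.log2(p_x_given_yz)).sum()

# Via the identity: H(X|Y,Z) = H(X,Y,Z) - H(Y,Z)
via_chain = H(p) - H(p_yz)

print(abs(direct - via_chain) < 1e-12)   # True
```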
Now if I have $H(X\mid Y,Z,A,B)$ I can write: $$H(X\mid Y,Z,A,B) = -\sum p(x,y,z,a,b)\log_2 p(x\mid y,z,a,b)$$ where the joint distribution factors as $$p(x,y,z,a,b) = p(x\mid y,z,a,b)\,p(y\mid z,a,b)\,p(z\mid a,b)\,p(a\mid b)\,p(b).$$
Can I develop this in a simpler way?
and: $$H(X\mid Y,Z,A,B)=H(X,Y,Z,A,B)-H(Y,Z,A,B)=H(X,Y,Z,A,B)-H(Y\mid Z,A,B)-H(Z\mid A,B)-H(A\mid B)-H(B)$$
Am I right?
Thank you for your answers! B.
Entropy satisfies the following easily checked "chain rule": $H(X,Y) = H(X) + H(Y|X)$
So, one way to develop your identity is to treat $(Y,Z,A,B)$ as a new random variable, say $\Theta$. Then, we have $H(X|\Theta) = H(X,\Theta) - H(\Theta) = H(X,Y,Z,A,B) - H(Y,Z,A,B)$. Now one can expand $H(Y,Z,A,B)$ again using the chain rule to get the expression you have.
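To make the grouping trick concrete, here is a sketch (my own, with arbitrary alphabet sizes) that checks numerically that $H(X\mid Y,Z,A,B)=H(X,Y,Z,A,B)-H(Y,Z,A,B)$, and that expanding $H(Y,Z,A,B)$ by the chain rule term by term gives the same value:

```python
# Check H(X|Y,Z,A,B) = H(X,Y,Z,A,B) - H(Y,Z,A,B)
#                    = H(X,Y,Z,A,B) - H(Y|Z,A,B) - H(Z|A,B) - H(A|B) - H(B)
import numpy as np

rng = np.random.default_rng(1)
p = rng.random((2, 2, 3, 2, 3))   # joint p(x,y,z,a,b); axes: x,y,z,a,b
p /= p.sum()

def H(q):
    """Shannon entropy in bits of a distribution given as an array."""
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

# Marginals obtained by summing out the leading axes one at a time
p_yzab = p.sum(axis=0)        # p(y,z,a,b) -- the grouped variable Theta
p_zab  = p_yzab.sum(axis=0)   # p(z,a,b)
p_ab   = p_zab.sum(axis=0)    # p(a,b)
p_b    = p_ab.sum(axis=0)     # p(b)

# Treating Theta = (Y,Z,A,B) as one variable:
lhs = H(p) - H(p_yzab)        # H(X|Theta) = H(X,Theta) - H(Theta)

# Expanding H(Y,Z,A,B) by the chain rule, term by term:
H_y_given_zab = H(p_yzab) - H(p_zab)
H_z_given_ab  = H(p_zab)  - H(p_ab)
H_a_given_b   = H(p_ab)   - H(p_b)
rhs = H(p) - H_y_given_zab - H_z_given_ab - H_a_given_b - H(p_b)

print(abs(lhs - rhs) < 1e-12)   # True: both routes agree
```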