I'm just working through some information theory and entropy, and I've run into a bit of a problem.
In many texts, it's easy to find the "chain rule" for the entropy of two variables and the "conditional chain rule" for three: $$H(Y|X) = H(X,Y) - H(X)$$ $$H(X,Y|Z) = H(Y|Z) + H(X|Y,Z) = H(X|Z) + H(Y|X,Z)$$
However, I'm trying to determine the entropy of three random variables: $H(X,Y,Z)$. I haven't done a lot of probability/statistics before, and googling hasn't really turned up anything too fruitful.
Can anyone help me derive this result?
You can combine the "conditional chain rule" and the "chain rule" to extend the joint entropy from two to three variables in a variety of ways. For example: $$\begin{align}H(X,Y,Z) &= H(X|Y,Z) + \color{blue}{H(Y,Z)}\\&=\color{red}{H(X|Y,Z)} + \color{blue}{H(Y|Z)+H(Z)}\\&=\color{red}{H(X,Y|Z)-H(Y|Z)}+H(Y|Z)+H(Z)\\&=H(X,Y|Z)+H(Z)\end{align}$$
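As a sanity check, the three-variable chain rule $H(X,Y,Z) = H(X|Y,Z) + H(Y|Z) + H(Z)$ can be verified numerically on any small joint distribution. Here is a minimal sketch in Python, using an arbitrary (made-up) distribution over three binary variables; the conditional entropies are computed from their definitions, not from the identity being tested:

```python
import numpy as np

# Hypothetical joint pmf p[x, y, z] over binary X, Y, Z.
# The values are arbitrary; any valid distribution works.
p = np.array([
    [[0.10, 0.05], [0.05, 0.10]],
    [[0.20, 0.10], [0.15, 0.25]],
])
assert np.isclose(p.sum(), 1.0)

def H(q):
    """Shannon entropy in bits of a (possibly multi-dimensional) pmf."""
    q = np.asarray(q).flatten()
    q = q[q > 0]  # convention: 0 * log(0) = 0
    return -np.sum(q * np.log2(q))

# Marginals p(y, z) and p(z).
p_yz = p.sum(axis=0)
p_z = p.sum(axis=(0, 1))

H_z = H(p_z)

# H(Y|Z) = sum_z p(z) * H(Y | Z=z), straight from the definition.
H_y_given_z = sum(p_z[z] * H(p_yz[:, z] / p_z[z]) for z in range(2))

# H(X|Y,Z) = sum_{y,z} p(y,z) * H(X | Y=y, Z=z).
H_x_given_yz = sum(
    p_yz[y, z] * H(p[:, y, z] / p_yz[y, z])
    for y in range(2) for z in range(2)
)

# Three-variable chain rule: H(X,Y,Z) = H(X|Y,Z) + H(Y|Z) + H(Z).
assert np.isclose(H(p), H_x_given_yz + H_y_given_z + H_z)

# Equivalently, H(X,Y,Z) = H(X,Y|Z) + H(Z), with H(X,Y|Z) from its definition.
H_xy_given_z = sum(p_z[z] * H(p[:, :, z] / p_z[z]) for z in range(2))
assert np.isclose(H(p), H_xy_given_z + H_z)
```

Since the chain rule is an identity, the assertions hold for any joint distribution you substitute for `p`.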