Entropy for three random variables


I'm just working through some information theory and entropy, and I've come into a bit of a problem.

In many texts, it's easy to find the "chain rule" for entropy in two variables, and the "conditional chain rule" for three variables, respectively; $$H(Y|X) = H(X,Y) - H(X)$$ $$H(X,Y|Z) = H(Y|Z) + H(X|Y,Z) = H(X|Z) + H(Y|X,Z)$$

However, I'm trying to determine the entropy of three random variables: $H(X,Y,Z)$. I haven't done a lot of probability/statistics before, and googling hasn't really turned up anything too fruitful.

Can anyone help me derive this result?


BEST ANSWER

You can combine the "conditional chain rule" and the "chain rule" to extend the joint entropy from two to three variables in a variety of ways, as follows: $$\begin{align}H(X,Y,Z) &= H(X|Y,Z) + \color{blue}{H(Y,Z)}\\&=\color{red}{H(X|Y,Z)} + \color{blue}{H(Y|Z)+H(Z)}\\&=\color{red}{H(X,Y|Z)-H(Y|Z)}+H(Y|Z)+H(Z)\\&=H(X,Y|Z)+H(Z)\end{align}$$
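The identity $H(X,Y,Z) = H(X|Y,Z) + H(Y|Z) + H(Z)$ can be checked numerically. Below is a minimal sketch in Python/NumPy, assuming a small made-up joint distribution over three binary variables (the specific probabilities are arbitrary; any valid joint distribution works). The conditional entropies are computed directly from the conditional distributions rather than via the identity being tested, so the check is not circular.

```python
import numpy as np

# Hypothetical joint distribution p(x, y, z) over binary X, Y, Z;
# the numbers are arbitrary -- any valid joint distribution works.
p = np.array([
    [[0.10, 0.05], [0.15, 0.10]],
    [[0.05, 0.20], [0.20, 0.15]],
])  # indexed as p[x, y, z]; entries sum to 1

def H(dist):
    # Shannon entropy in bits of a probability array (zero entries skipped).
    q = np.asarray(dist).ravel()
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

H_xyz = H(p)                # H(X,Y,Z)
p_yz = p.sum(axis=0)        # marginal p(y, z)
p_z = p_yz.sum(axis=0)      # marginal p(z)

# H(X|Y,Z) = sum_{y,z} p(y,z) * H(X | Y=y, Z=z), from the conditionals
H_x_given_yz = sum(p_yz[y, z] * H(p[:, y, z] / p_yz[y, z])
                   for y in range(2) for z in range(2))

# H(Y|Z) = sum_z p(z) * H(Y | Z=z)
H_y_given_z = sum(p_z[z] * H(p_yz[:, z] / p_z[z]) for z in range(2))

# Chain rule: H(X,Y,Z) = H(X|Y,Z) + H(Y|Z) + H(Z)
print(np.isclose(H_xyz, H_x_given_yz + H_y_given_z + H(p_z)))  # True
```

The same pattern verifies the other ordering, $H(X,Y|Z) + H(Z)$, since $H(X,Y|Z) = H(X|Y,Z) + H(Y|Z)$.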


Exactly as the other answer mentions, one can expand (recursively) the joint entropy of $n$ variables into the joint and conditional entropy of $n-1$ variables, and so on.
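For reference, this recursion produces the general chain rule for entropy: $$H(X_1,\dots,X_n) = \sum_{i=1}^{n} H(X_i \mid X_1,\dots,X_{i-1})$$ For three variables this reads $H(X,Y,Z) = H(X) + H(Y|X) + H(Z|X,Y)$, and any ordering of the variables works equally well.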

This parallels the way the joint probability of $n$ variables can be reduced to the joint and conditional probability of $n-1$ variables, using the definition of conditional probability, i.e. $P(A|B) = P(AB)/P(B)$.

(Just think of entropy as the expected value of minus the logarithm of probability: the logarithm turns products of probabilities into sums, so the probability chain rule becomes the entropy chain rule.)
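Spelled out, the derivation takes the probability chain rule, applies $-\log$, and then takes the expectation over the joint distribution: $$P(X,Y,Z) = P(X \mid Y,Z)\,P(Y \mid Z)\,P(Z)$$ $$-\log P(X,Y,Z) = -\log P(X \mid Y,Z) - \log P(Y \mid Z) - \log P(Z)$$ $$\mathbb{E}\left[-\log P(X,Y,Z)\right] = H(X,Y,Z) = H(X \mid Y,Z) + H(Y \mid Z) + H(Z)$$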