Entropy of Random Variable X, given Y and Z


I'm trying to compute the mutual information between the random variable X and the pair (Y, Z), as below:

$I(X;Y,Z) = H(X) - H(X | Y,Z)$

I know that,

$H(X \mid Y) = -\sum_{x,y} p(x,y)\log\left(\frac{p(x,y)}{p(y)}\right)$

and

$p(x \mid y,z) = \frac{p(x,y,z)}{p(y,z)}$

However, I don't know how to define $H(X \mid Y,Z)$, so that I can compute $I(X;Y,Z) = H(X) - H(X \mid Y,Z)$.
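To make the quantities above concrete, here is a minimal NumPy sketch for the discrete case. The joint distribution `p_xyz` is a made-up toy table over binary X, Y, Z (an assumption, purely for illustration); it computes $H(X)$ from the marginal and $H(X \mid Y,Z) = -\sum_{x,y,z} p(x,y,z)\log p(x \mid y,z)$ using the conditional $p(x \mid y,z) = p(x,y,z)/p(y,z)$:

```python
import numpy as np

# Hypothetical toy joint distribution p(x, y, z) over binary X, Y, Z.
# Axes are ordered (x, y, z); the numbers are made up for illustration.
p_xyz = np.array([[[0.10, 0.05],
                   [0.15, 0.10]],
                  [[0.05, 0.20],
                   [0.20, 0.15]]])
assert np.isclose(p_xyz.sum(), 1.0)

p_x = p_xyz.sum(axis=(1, 2))    # marginal p(x)
p_yz = p_xyz.sum(axis=0)        # marginal p(y, z)

# H(X) = -sum_x p(x) log2 p(x)
H_X = -np.sum(p_x * np.log2(p_x))

# p(x | y, z) = p(x, y, z) / p(y, z); division broadcasts over the x axis
p_x_given_yz = p_xyz / p_yz

# H(X | Y, Z) = -sum_{x,y,z} p(x,y,z) log2 p(x | y,z)
H_X_given_YZ = -np.sum(p_xyz * np.log2(p_x_given_yz))

# Mutual information I(X; Y,Z) = H(X) - H(X | Y,Z)
I_X_YZ = H_X - H_X_given_YZ
print(H_X, H_X_given_YZ, I_X_YZ)
```

With any valid joint table, conditioning can only reduce entropy, so the result satisfies $0 \le I(X;Y,Z) \le H(X)$; zeros in `p_xyz` would need masking before taking logs.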

Any help appreciated.