I have been reading a bit about conditional entropy, joint entropy, etc., but I came across this: $H(X|Y,Z)$, which seems to denote the entropy of $X$ given $Y$ and $Z$ (although I'm not sure how to describe it). Is it the amount of uncertainty remaining about $X$ when I know both $Y$ and $Z$? Anyway, I'd like to know how to calculate it. I thought this expression means the following:
$$H(X|Y,Z) = -\sum_{x,y,z} p(x,y,z)\log_{2}p(x|y,z)$$
and assuming that $p(x|y,z)$ means $\dfrac{p(x,y,z)}{p(y)p(z)}$, then \begin{align} p(x|y,z)&=\frac{p(x,y,z)}{p(x,y)p(z)}\cdot\frac{p(x,y)}{p(y)}\\&=\frac{p(x,y,z)}{p(x,y)p(z)}\,p(x|y) \\&=\frac{p(x,y,z)}{p(x,y)p(x,z)}\cdot\frac{p(x,z)}{p(z)}\,p(x|y)\\&=\frac{p(x,y,z)}{p(x,y)p(x,z)}\,p(x|z)\,p(x|y) \end{align} but that doesn't really help.
Basically, I wanted to get a nice identity, such as $H(X|Y)=H(X,Y)-H(Y)$ in the case of two random variables.
Any help?
Thanks
$$H(X\mid Y,Z)=H(X,Y,Z)-H(Y,Z)=H(X,Y,Z)-H(Y\mid Z)-H(Z)$$ Edit: Since $\log p(x\mid y,z)=\log p(x,y,z)-\log p(y,z)$, $$ H(X\mid Y,Z)=-\sum\limits_{x,y,z}p(x,y,z)\log p(x,y,z)+\sum\limits_{y,z}\left(\sum\limits_{x}p(x,y,z)\right)\cdot\log p(y,z). $$ Each sum in parentheses equals $p(y,z)$, which proves the first identity above.
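If it helps to see the identity numerically, here is a minimal sketch (the joint distribution over $\{0,1\}^3$ is arbitrary, chosen just for illustration) that computes $H(X\mid Y,Z)$ both directly from the definition and via $H(X,Y,Z)-H(Y,Z)$, and checks that the two agree:

```python
import math
import random
from itertools import product

# An arbitrary joint distribution p(x, y, z) over {0,1}^3, for illustration only.
random.seed(0)
weights = [random.random() for _ in range(8)]
total = sum(weights)
p = {xyz: w / total for xyz, w in zip(product([0, 1], repeat=3), weights)}

def H(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginal p(y, z) = sum over x of p(x, y, z).
p_yz = {}
for (x, y, z), q in p.items():
    p_yz[(y, z)] = p_yz.get((y, z), 0.0) + q

# Direct definition: H(X|Y,Z) = -sum_{x,y,z} p(x,y,z) log2 [p(x,y,z) / p(y,z)],
# using p(x|y,z) = p(x,y,z) / p(y,z).
H_direct = -sum(q * math.log2(q / p_yz[(y, z)])
                for (x, y, z), q in p.items() if q > 0)

# Chain-rule identity: H(X|Y,Z) = H(X,Y,Z) - H(Y,Z).
H_identity = H(p) - H(p_yz)

print(H_direct, H_identity)  # agree up to floating-point error
```

Note that the conditioning distribution is $p(y,z)$, the joint marginal of $Y$ and $Z$, not the product $p(y)p(z)$; those coincide only when $Y$ and $Z$ are independent.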