Based on https://en.wikipedia.org/wiki/Conditional_mutual_information: Conditional mutual information is the expected value of the mutual information of two random variables given the value of a third.
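For discrete variables, that expectation unrolls to $I(X;Y\mid Z)=\sum_{x,y,z} p(x,y,z)\log\frac{p(z)\,p(x,y,z)}{p(x,z)\,p(y,z)}$, and can be computed directly from a joint pmf. Here is a minimal sketch of that standard quantity for reference (the function name and the pmf-as-dict representation are my own choices, not from the linked article):

```python
import math

def conditional_mutual_information(p_xyz):
    """I(X; Y | Z) in nats, given a discrete joint pmf.

    p_xyz: dict mapping (x, y, z) tuples to probabilities that sum to 1.
    """
    # Marginalize the joint pmf to get p(z), p(x,z), p(y,z).
    p_z, p_xz, p_yz = {}, {}, {}
    for (x, y, z), p in p_xyz.items():
        p_z[z] = p_z.get(z, 0.0) + p
        p_xz[(x, z)] = p_xz.get((x, z), 0.0) + p
        p_yz[(y, z)] = p_yz.get((y, z), 0.0) + p
    # Sum p(x,y,z) * log( p(z) p(x,y,z) / (p(x,z) p(y,z)) ) over the support.
    cmi = 0.0
    for (x, y, z), p in p_xyz.items():
        if p > 0:
            cmi += p * math.log(p * p_z[z] / (p_xz[(x, z)] * p_yz[(y, z)]))
    return cmi

# Sanity checks: if Y = Z, then I(X;Y|Z) = 0; if Z is independent of
# X = Y (a uniform bit), then I(X;Y|Z) = I(X;Y) = log 2.
print(conditional_mutual_information({(0, 0, 0): 0.5, (1, 1, 1): 0.5}))
print(conditional_mutual_information(
    {(0, 0, 0): 0.25, (0, 0, 1): 0.25, (1, 1, 0): 0.25, (1, 1, 1): 0.25}))
```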
My question: is there a similar measure for the mutual information of one variable when it is conditioned on two other random variables? For example, given the random variables $X$, $Y$ and $Z$, what is the mutual information between $P(X\mid Y)$ and $P(X\mid Z)$? Following the notation in the reference above, I am looking for something like $I(X\mid Y;Z)$.