I have two observers (Y and Z) making a subjective evaluation. There is a relatively high rate of disagreement between the two, but over a "population" of observations it appears that their opinions somewhat, though not entirely, balance each other out in their ability to correctly identify the label X (disease or no disease). So: $I(X;Y) > 0$ and $I(X;Z)> 0$, but $I(X;Y) \ne I(X;Z)$.
By the chain rule of mutual information (2.5.3 in Cover) it appears that $I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z)$ (first-- is this correct?). Assuming Y and Z are neither pure noise nor completely redundant, conditioning on Z improves upon the information Y carries about X. The question is whether using Y as the conditioning variable leads to the same result. Does
$$I(X; Y) + I(X; Z | Y) = I(X; Z) + I(X; Y | Z)$$
Would there be a simple proof if the above is true? Or am I off base? The reason for the question is that discordance in subjective observations is typically considered a bad thing (i.e. a low kappa statistic). It may be that, if both observations have relevance, discordance can be a good thing-- a low kappa statistic may even be a promising number, a measure of how much one observer offers the other and vice versa.
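As a numerical sanity check of the proposed identity, here is a sketch that computes both decompositions of $I(X; Y, Z)$ from entropies of the marginals. The joint pmf below is an arbitrary made-up example (random, then normalized), not real observer data:

```python
import numpy as np

# Hypothetical joint pmf p(x, y, z) over three binary variables;
# any valid joint distribution would do for this check.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

def H(pmf):
    """Shannon entropy in bits of a pmf given as an array."""
    q = pmf[pmf > 0]
    return -np.sum(q * np.log2(q))

# Marginal distributions.
p_x  = p.sum(axis=(1, 2))
p_y  = p.sum(axis=(0, 2))
p_z  = p.sum(axis=(0, 1))
p_xy = p.sum(axis=2)
p_xz = p.sum(axis=1)
p_yz = p.sum(axis=0)

# Mutual informations via I(A;B) = H(A) + H(B) - H(A,B).
I_xy = H(p_x) + H(p_y) - H(p_xy)                  # I(X;Y)
I_xz = H(p_x) + H(p_z) - H(p_xz)                  # I(X;Z)
# Conditional MIs via I(A;B|C) = H(A,C) + H(B,C) - H(C) - H(A,B,C).
I_xz_given_y = H(p_xy) + H(p_yz) - H(p_y) - H(p)  # I(X;Z|Y)
I_xy_given_z = H(p_xz) + H(p_yz) - H(p_z) - H(p)  # I(X;Y|Z)

lhs = I_xy + I_xz_given_y   # I(X;Y) + I(X;Z|Y)
rhs = I_xz + I_xy_given_z   # I(X;Z) + I(X;Y|Z)
print(abs(lhs - rhs))       # difference should be ~0 up to floating point
```

Both sums are the same chain-rule expansion of $I(X; Y, Z)$, just taken in opposite orders, so they agree for any joint distribution.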
Thank you, leonbloy, for your help. Based on your comment I found theorem 2.8.1 in Cover and Thomas, 2nd edition, page 34. From my medical background, what this symmetry means is that two observers may both be capable, but one observer (Y) might be considered "smarter", or of greater stature; even so, the lesser one (Z = "junior" chap) can be helpful (informative). This is radical in some circles, but it supports some data I have. An example is the possible value of "double reads" in the interpretation of mammograms for breast cancer (X = cancer). This will prompt a follow-on question. Again, thanks.
By definition (note the denominator pairs $X$ against the joint variable $(Y,Z)$, and the joint pmf is unchanged under reordering its arguments): $$\begin{align}\mathsf I(X; Y, Z) =~& \sum_{z\in\{0,1\}}\sum_{y\in\{0,1\}}\sum_{x\in\{0,1\}} \mathsf p_{X,Y,Z}(x,y,z)\log\dfrac{\mathsf p_{X,Y,Z}(x,y,z)}{\mathsf p_X(x)\,\mathsf p_{Y,Z}(y,z)} \\[1ex] =~& \sum_{y\in\{0,1\}}\sum_{z\in\{0,1\}}\sum_{x\in\{0,1\}} \mathsf p_{X,Z,Y}(x,z,y)\log\dfrac{\mathsf p_{X,Z,Y}(x,z,y)}{\mathsf p_X(x)\,\mathsf p_{Z,Y}(z,y)} \\[1ex] =~& \mathsf I(X; Z, Y) \end{align}$$
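Combining this symmetry with the chain rule applied in each order then gives the identity asked about:

$$\mathsf I(X; Y) + \mathsf I(X; Z \mid Y) \;=\; \mathsf I(X; Y, Z) \;=\; \mathsf I(X; Z, Y) \;=\; \mathsf I(X; Z) + \mathsf I(X; Y \mid Z).$$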