What is the relationship between mutual information conditioned on different variables: I(W;X|Y) vs. I(W;X|Z)


Let four random variables form the Markov chain

$$W \longrightarrow X \longleftarrow Y \longrightarrow Z$$

such that the conditional distribution of $X$ depends only on $W$ and $Y$, and $X$ is conditionally independent of $Z$ given $Y$. What is the relationship between the conditional mutual informations $I(W;X\mid Y)$ and $I(W;X\mid Z)$ (e.g., is one always greater than or equal to the other)?

EDIT: $W$ also depends on $Y$, i.e. there is an edge $Y\rightarrow W$ (not shown in the graph above).


Your DAG encodes the factorisation
$$p(w,x,y,z)= p(y)\,p(w\mid y)\,p(x\mid w,y)\,p(z\mid y).$$
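As a quick numerical sanity check of this factorisation (my own sketch, not part of the answer; the variable names, binary alphabets, and use of NumPy are all assumptions), one can build a random joint distribution from the four factors and verify that it normalises and that it implies $X \perp Z \mid Y$:

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_dist(*shape):
    """Random (conditional) distribution; the last axis sums to 1."""
    p = rng.random(shape)
    return p / p.sum(axis=-1, keepdims=True)

# Binary variables for simplicity.
p_y = rand_dist(2)            # p(y), indexed [y]
p_w_y = rand_dist(2, 2)       # p(w|y), indexed [y, w]
p_x_wy = rand_dist(2, 2, 2)   # p(x|w,y), indexed [w, y, x]
p_z_y = rand_dist(2, 2)       # p(z|y), indexed [y, z]

# Joint p(w,x,y,z) = p(y) p(w|y) p(x|w,y) p(z|y), axes ordered (w, x, y, z).
p = np.einsum('y,yw,wyx,yz->wxyz', p_y, p_w_y, p_x_wy, p_z_y)
assert np.isclose(p.sum(), 1.0)

# The factorisation implies X ⊥ Z | Y, i.e. p(x,z|y) = p(x|y) p(z|y).
p_xyz = p.sum(axis=0)                     # marginalise out w; axes (x, y, z)
p_y_marg = p_xyz.sum(axis=(0, 2))         # p(y)
lhs = p_xyz / p_y_marg[None, :, None]     # p(x,z|y)
rhs = np.einsum('xy,yz->xyz',
                p_xyz.sum(axis=2) / p_y_marg[None, :],   # p(x|y)
                p_xyz.sum(axis=0) / p_y_marg[:, None])   # p(z|y)
assert np.allclose(lhs, rhs)
```

This only confirms the conditional independence that the DAG encodes; it says nothing yet about how the two mutual informations compare.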

$$\begin{align}I(W;X\mid Y) &= \sum_{w,x,y} p(y)\,p(w,x\mid y)\log\left(\frac{p(w,x\mid y)}{p(w\mid y)\,p(x\mid y)}\right)\\[1ex] &= \sum_{w,x,y} p(y)\,p(w\mid y)\,p(x\mid w,y)\log\left(\frac{p(x\mid w,y)}{p(x\mid y)}\right)\\[3ex] I(W;X\mid Z) &= \sum_{w,x,z} p(z)\,p(w,x\mid z)\log\left(\frac{p(w,x\mid z)}{p(w\mid z)\,p(x\mid z)}\right)\\[1ex] &~~\vdots \end{align}$$
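These sums can also be evaluated numerically. Here is a minimal sketch (my own code, under the assumption of binary variables and a random joint built from the factorisation above) that computes both conditional mutual informations directly from the definition:

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_dist(*shape):
    """Random (conditional) distribution; the last axis sums to 1."""
    p = rng.random(shape)
    return p / p.sum(axis=-1, keepdims=True)

# Joint over binary (w, x, y, z): p(w,x,y,z) = p(y) p(w|y) p(x|w,y) p(z|y).
p = np.einsum('y,yw,wyx,yz->wxyz',
              rand_dist(2), rand_dist(2, 2),
              rand_dist(2, 2, 2), rand_dist(2, 2))

def cmi(p_abc):
    """I(A;B|C) in nats for a joint array with axes (a, b, c)."""
    p_c = p_abc.sum(axis=(0, 1))
    p_ac = p_abc.sum(axis=1)          # axes (a, c)
    p_bc = p_abc.sum(axis=0)          # axes (b, c)
    # p(a,b|c) / (p(a|c) p(b|c)) = p(a,b,c) p(c) / (p(a,c) p(b,c))
    num = p_abc * p_c[None, None, :]
    den = p_ac[:, None, :] * p_bc[None, :, :]
    mask = p_abc > 0
    return float(np.sum(p_abc[mask] * np.log(num[mask] / den[mask])))

i_wx_given_y = cmi(p.sum(axis=3))   # marginalise out z -> axes (w, x, y)
i_wx_given_z = cmi(p.sum(axis=2))   # marginalise out y -> axes (w, x, z)
print(i_wx_given_y, i_wx_given_z)
```

Sampling many random factorisations this way is a cheap way to probe whether an inequality such as $I(W;X\mid Y)\ge I(W;X\mid Z)$ could hold in general: a single random counterexample refutes it, while consistent agreement suggests looking for a proof.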