Markov Chain for Conditional Entropy


I am given the following problem: prove that if $X \rightarrow Y \rightarrow Z$ is a Markov chain, then so is $Z \rightarrow Y \rightarrow X$.

Here is what I tried:

$Z \rightarrow Y \rightarrow X$

$H(Z) \ge H(Z:Y) \ge H(Z:X)$

$H(Z:Y,X) = H(Z:X)+H(Z:Y)$

$H(Z:Y) +H$

As you can see, I could not get anywhere. By the way, this is not homework; it is a question from a worksheet we were given to study for the final examination. Any help would be appreciated.

Best answer:

By the definition of a Markov chain, $X\rightarrow Y \rightarrow Z$ holds if and only if

$$ \tag{1} p(x,y,z)=p(x)p(y|x)p(z|y). $$

By the standard chain rule, we also have

$$ \tag{2} p(x,y,z)=p(z)p(y|z)p(x|y,z). $$

Now note that

$$ \begin{align} p(x|y,z)&=\frac{p(x,y,z)}{p(y)p(z|y)}\\ &\stackrel{(a)}{=}\frac{p(x)p(y|x)p(z|y)}{p(y)p(z|y)}\\ &=\frac{p(x)p(y|x)}{p(y)}\\ &=p(x|y),\tag{3} \end{align} $$

where $(a)$ follows from $(1)$. Substituting $(3)$ in $(2)$ results in

$$ p(x,y,z)=p(z)p(y|z)p(x|y), $$

that is, $Z\rightarrow Y \rightarrow X$ also holds.
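As a quick numerical sanity check, one can build a joint distribution for binary $X$, $Y$, $Z$ from the factorization $(1)$ (the particular probability values below are made up for illustration) and verify that the reverse factorization $p(x,y,z)=p(z)p(y|z)p(x|y)$ holds at every point:

```python
import itertools

# Hypothetical distributions for binary X, Y, Z (illustrative values only).
px = {0: 0.3, 1: 0.7}
py_x = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}   # p(y|x)
pz_y = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}   # p(z|y)

# Joint built via (1): p(x,y,z) = p(x) p(y|x) p(z|y), so X -> Y -> Z by construction.
p = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
     for x, y, z in itertools.product([0, 1], repeat=3)}

# Marginals and conditionals needed for the reverse factorization.
pz = {z: sum(p[(x, y, z)] for x in [0, 1] for y in [0, 1]) for z in [0, 1]}
pyz = {(y, z): sum(p[(x, y, z)] for x in [0, 1]) for y in [0, 1] for z in [0, 1]}
py = {y: sum(pyz[(y, z)] for z in [0, 1]) for y in [0, 1]}
pxy = {(x, y): sum(p[(x, y, z)] for z in [0, 1]) for x in [0, 1] for y in [0, 1]}

# Check p(x,y,z) = p(z) p(y|z) p(x|y) at every point.
for (x, y, z), val in p.items():
    rhs = pz[z] * (pyz[(y, z)] / pz[z]) * (pxy[(x, y)] / py[y])
    assert abs(val - rhs) < 1e-12
```

Changing the numbers in `px`, `py_x`, `pz_y` to any other valid probabilities leaves the check passing, as the derivation above predicts.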

Another way to show this is to note that $X\rightarrow Y \rightarrow Z$ if and only if

$$ \tag{4} p(x,z|y)=p(x|y)p(z|y), $$

i.e., the "first" and "last" variables are independent when conditioned on the "middle" variable. Since the roles of "first" and "last" can be exchanged (either $X$, $Z$ or $Z$, $X$, respectively), condition $(4)$ equally expresses $Z\rightarrow Y \rightarrow X$.
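The symmetric condition $(4)$ can be checked numerically as well (again using made-up binary distributions that satisfy $(1)$ by construction):

```python
import itertools

# Hypothetical distributions satisfying (1) by construction (illustrative values only).
px = {0: 0.3, 1: 0.7}
py_x = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}   # p(y|x)
pz_y = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}   # p(z|y)

p = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
     for x, y, z in itertools.product([0, 1], repeat=3)}

py = {y: sum(p[(x, y, z)] for x in [0, 1] for z in [0, 1]) for y in [0, 1]}
pxy = {(x, y): sum(p[(x, y, z)] for z in [0, 1]) for x in [0, 1] for y in [0, 1]}
pyz = {(y, z): sum(p[(x, y, z)] for x in [0, 1]) for y in [0, 1] for z in [0, 1]}

# Condition (4): p(x,z|y) = p(x|y) p(z|y), which is symmetric in X and Z.
for x, y, z in itertools.product([0, 1], repeat=3):
    lhs = p[(x, y, z)] / py[y]          # p(x,z|y)
    rhs = (pxy[(x, y)] / py[y]) * (pyz[(y, z)] / py[y])
    assert abs(lhs - rhs) < 1e-12
```

Since the product $p(x|y)\,p(z|y)$ does not change when $X$ and $Z$ swap roles, the same check certifies both orderings of the chain.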