Function of a Markov chain is a Markov chain


Let $X_n$ be a Markov chain whose states take values in a set $S$, and let $f:S\rightarrow\mathbb{R}$ be a function. If $f(X_n)$ is also a Markov chain, prove that $f$ is either injective or constant. I'm totally clueless as to how to proceed.

The statement you're trying to prove is false.

For example, let $S =\{1,2,3\}$. Let $X_n$ be a Markov chain with transition matrix $$ \left[ \begin{array}{ccc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]. $$ Let $f(x) = 1$ if $x=1$, and $2$ if $x \in \{2,3\}$. Then $f$ is neither injective nor constant, but $f(X_{n})$ is a Markov chain with transition matrix $$ \left[ \begin{array}{cc} 1 & 0\\ 0 & 1\\ \end{array} \right]. $$

The above example was not irreducible. Here's an irreducible example: $$ \left[ \begin{array}{ccc} 1/3 & 1/3 & 1/3 \\ 1 & 0 & 0 \\ 1 & 0 & 0 \end{array} \right]. $$ We keep the same $f$. Then $f(X_{n})$ has transition matrix $$ \left[ \begin{array}{cc} 1/3 & 2/3\\ 1 & 0\\ \end{array} \right]. $$

Here's a more complicated example: $$ \left[ \begin{array}{ccc} 2/6 & 1/6 & 3/6 \\ 1/4 & 2/4 & 1/4 \\ 1/4 & 1/4 & 2/4 \end{array} \right]. $$ Then $f(X_{n})$ has transition matrix $$ \left[ \begin{array}{cc} 1/3 & 2/3\\ 1/4 & 3/4\\ \end{array} \right]. $$
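The lumped transition matrix in the last example can be checked numerically. The sketch below (using numpy; the variable names are mine) sums, for each original state, the outgoing probability into each merged class, and confirms that states 2 and 3 produce the same row, so the lumped matrix is well-defined:

```python
import numpy as np

# Transition matrix from the last example (states 0, 1, 2 here, 0-indexed)
P = np.array([[2/6, 1/6, 3/6],
              [1/4, 2/4, 1/4],
              [1/4, 1/4, 2/4]])

# f maps state 0 to one class and states 1, 2 to the other
lumps = [[0], [1, 2]]

# For each original state, total outgoing probability into each lump
lumped_rows = np.array([[P[i, lump].sum() for lump in lumps]
                        for i in range(3)])

# Rows for states 1 and 2 agree, so the lumped chain has a
# well-defined 2x2 transition matrix: [[1/3, 2/3], [1/4, 3/4]]
Q = lumped_rows[[0, 1]]
print(Q)
```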

As suggested in the second comment on the question, it is possible to merge states with the same outgoing probabilities. But in the last example, states 2 and 3 do not have identical rows; what they share is the same total outgoing probability into each merged class: each moves to state 1 with probability $1/4$ and stays within $\{2,3\}$ with probability $3/4$. That weaker condition is enough for $f(X_n)$ to be Markov.
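This condition (often called strong lumpability: within each merged class, every state has the same total probability of moving into each class) can be sketched as a small checker. The function name and structure are my own illustration, not from the original discussion:

```python
import numpy as np

def is_lumpable(P, lumps):
    """Check whether the partition `lumps` is strongly lumpable for P:
    within each lump, all states must have equal total probability of
    moving into every lump (identical full rows are not required)."""
    for lump in lumps:
        # One row per state in the lump; one column per target lump
        block = np.array([[P[i, other].sum() for other in lumps]
                          for i in lump])
        if not np.allclose(block, block[0]):
            return False
    return True

# The last example from the answer: lumpable even though the
# full rows of states 1 and 2 (0-indexed) differ
P = np.array([[2/6, 1/6, 3/6],
              [1/4, 2/4, 1/4],
              [1/4, 1/4, 2/4]])

print(is_lumpable(P, [[0], [1, 2]]))   # True
print(np.allclose(P[1], P[2]))         # False
```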