How to prove that a Markov chain is positively correlated?


I am trying to prove that a Markov chain is positively correlated. Consider a basic two-state Markov chain with states 0 and 1, transition probability $p_{01}$ from state 0 to state 1, and $p_{11}$ from state 1 to itself (so $p_{10}=1-p_{11}$ and $p_{00}=1-p_{01}$). The book says that if $p_{11}>p_{01}$ the chain is positively correlated, and I would like to prove why. I start from $E[(X(t+1)-E[X(t+1)])(X(t)-E[X(t)])] = E[X(t+1)X(t)] - E[X(t+1)]E[X(t)] > 0$, but what is the next step in the proof?
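One way to see what the covariance formula gives is to evaluate it exactly when the chain starts in its stationary distribution (this is my own sketch, not from the book; `stationary_cov` is a hypothetical helper name):

```python
def stationary_cov(p01, p11):
    """Exact Cov(X(t), X(t+1)) for the two-state chain started
    in its stationary distribution."""
    p10 = 1 - p11
    # Stationary probability of state 1: pi1 = p01 / (p01 + p10)
    pi1 = p01 / (p01 + p10)
    # Since X takes values in {0, 1}:
    #   E[X(t)] = E[X(t+1)] = pi1
    #   E[X(t+1) X(t)] = P(X(t)=1, X(t+1)=1) = pi1 * p11
    return pi1 * p11 - pi1 * pi1

# p11 > p01 gives positive covariance, p11 < p01 gives negative:
print(stationary_cov(0.3, 0.7))  # positive
print(stationary_cov(0.7, 0.3))  # negative
```

Trying a few values of $p_{01}$ and $p_{11}$ this way suggests that the sign of the covariance matches the sign of $p_{11}-p_{01}$.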

1 Answer

This is an interpretation, not a solution.

$p_{11} > p_{01}$ is equivalent to each of the inequalities

  • $p_{11} + p_{00} > 1$

  • $1 > p_{01}+p_{10}$

  • $p_{00} + p_{11} > p_{01}+p_{10}$

all of which say that (in some sense) the chain is more likely to stay at the current state than to change state.
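The equivalences above follow from the row-sum constraints $p_{00}=1-p_{01}$ and $p_{10}=1-p_{11}$; for example,

$$p_{11} > p_{01} \iff p_{11} + (1-p_{01}) > 1 \iff p_{11} + p_{00} > 1,$$

and the other two inequalities are obtained by the same substitutions.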

If $P(\text{stay} \mid X) > P(\text{change} \mid X)$ for every state $X$, then of course $X(t)$ and $X(t+1)$ would be positively correlated. The point of the problem is that the weaker preference for "stay" over "change" expressed above is already enough to make the correlation positive.

The statement is also false without specifying how the chain is started (e.g., in the stationary distribution, or in the uniform distribution). If positive correlation is claimed only in the limit $t \to \infty$, that is the same as claiming positive correlation when the chain is started in the stationary distribution.
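For completeness, here is a sketch of the stationary-start computation (my own derivation, not part of the original answer). With $\pi_1 = \frac{p_{01}}{p_{01}+p_{10}}$ the stationary probability of state 1, and since $X \in \{0,1\}$,

$$\operatorname{Cov}(X(t), X(t+1)) = \pi_1 p_{11} - \pi_1^2 = \pi_1 (p_{11} - \pi_1),$$

and

$$p_{11} - \pi_1 = \frac{p_{11}(p_{01}+p_{10}) - p_{01}}{p_{01}+p_{10}} = \frac{(1-p_{11})(p_{11}-p_{01})}{p_{01}+p_{10}},$$

using $p_{10} = 1-p_{11}$. So for $0 < p_{01}$ and $p_{11} < 1$, the covariance is positive exactly when $p_{11} > p_{01}$.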