Proving that the Markov chain is recurrent - Confusion/Help


Given the following transition matrix:

$$P = \begin{pmatrix} 0.9 & 0.1 \\ 0.8 & 0.2 \end{pmatrix}$$

Classify the states

From drawing the graph I know that both states are recurrent. However, I'm really failing to prove this mathematically using the equation below.

From Wiki

It can be shown that a state $i$ is recurrent if and only if the expected number of visits to this state is infinite, i.e., $$\sum\limits_{n=0}^{+\infty} p_{i,i}(n) = +\infty.$$

Here is how I calculate it (for State 1)

In 1 step --> 0.9

In 2 steps --> 0.1*0.8 = 0.08

In 3 steps --> 0.1*0.2*0.8 = 0.016

But I'm not sure how this proves that the sum tends to infinity.

In the example I'm basing this on, summing over the steps eventually gives a value that is >= 1, and then we say that as we sum to infinity the result is infinite, and thus the state is recurrent.

Note: we have to prove it using the above formula; if the state is transient, we should show instead that the summation from 0 to infinity is finite.
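To see the divergence numerically, here is a small sketch (not part of the original question; it uses numpy and assumes the criterion is applied to the $n$-step return probabilities $p_{1,1}(n)$, i.e. the top-left entry of $P^n$):

```python
import numpy as np

# Transition matrix from the question
P = np.array([[0.9, 0.1],
              [0.8, 0.2]])

# p_{1,1}(n) is the (0,0) entry of P^n (0-based indexing);
# accumulate the first N terms of the series
N = 1000
partial_sum = 0.0
Pn = np.eye(2)
for n in range(1, N + 1):
    Pn = Pn @ P
    partial_sum += Pn[0, 0]

# The partial sums grow without bound (roughly like (8/9) * N),
# which is what the divergence of the series looks like in practice
print(partial_sum)
```

Increasing N makes the partial sum as large as you like, which is the numerical signature of $\sum_n p_{1,1}(n) = +\infty$.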

Here is the example I'm basing it on:

$$\begin{pmatrix} 0.5 & 0.5 \\ 1 & 0 \end{pmatrix}$$

Here is what we did (for state 1):

In 1 step --> 0.5

In 2 steps --> 0.5*1 = 0.5

so overall we get 0.5 + 0.5 = 1, and the summation as we sum to infinity is infinite.
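The same numerical check works for this example too. This sketch (again assuming numpy, not part of the original question) reproduces the 0.5 + 0.5 = 1 computation of the return probabilities and also shows the partial sums of $p_{1,1}(n)$ growing without bound:

```python
import numpy as np

# Transition matrix of the second example
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# First-return probabilities computed in the text:
# return in 1 step (stay in state 1): 0.5
# return in 2 steps (state 1 -> state 2 -> state 1): 0.5 * 1
f = [0.5, 0.5 * 1.0]
print(sum(f))  # 1.0: the chain returns to state 1 with probability 1

# Equivalently, the partial sums of p_{1,1}(n) diverge
N = 1000
partial_sum = 0.0
Pn = np.eye(2)
for n in range(1, N + 1):
    Pn = Pn @ P
    partial_sum += Pn[0, 0]
print(partial_sum)  # grows roughly like (2/3) * N
```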

Any help is appreciated

1 Answer

A state $i$ is recurrent if and only if $$ \sum\limits_{n=1}^{+\infty} p_{i,i}(n) = +\infty $$ where $p_{i,i}(n) = \mathbb{P}\{X_{m + n} = i \mid X_m = i\}$, i.e. the probability of being back in state $i$ in $n$ steps after starting from state $i$.

As Did pointed out in his comment, the value $p_{i,i}(n)$ converges to a positive value (here $p_{1,1}(n) \to 8/9$, the stationary probability of state 1), so the terms of the series do not tend to $0$, the sum diverges, and hence $i$ is recurrent.
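This convergence is easy to check numerically. A small sketch (assuming numpy; not part of the original answer): the rows of $P^n$ approach the stationary distribution $(8/9, 1/9)$, so $p_{1,1}(n)$ approaches $8/9 > 0$.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.8, 0.2]])

# P^n converges quickly: both rows approach the stationary
# distribution (8/9, 1/9), so p_{1,1}(n) -> 8/9 > 0
Pn = np.linalg.matrix_power(P, 50)
print(Pn[0, 0])  # close to 8/9 ~ 0.8889
```

Since the terms of $\sum_n p_{1,1}(n)$ stay near $8/9$ instead of shrinking to $0$, the series must diverge, which is exactly the recurrence criterion.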