For a better understanding of Markov Chains


I'm doing some exercises in probability on Markov chains. There is a task whose meaning I don't really understand, but I think it is important for moving on.

It is as follows:

Let $P$ be a stochastic matrix on $E$ (at most countable); then $\nu(x,A) = \sum_{y \in A} P(x,y)$ defines a Markov kernel from $E$ to $E$. Let $(X_n)$ be a Markov chain with transition matrix $P$. Note that $P_l := P^l$ for $l \geq 1$ is a well-defined stochastic matrix and that $(X_{nl})_{n \geq 1}$ is also a Markov chain with $P_l$ as its transition matrix.
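As a quick sanity check (not part of the exercise), one can verify numerically that powers of a stochastic matrix are again stochastic; the matrix `P` below is a made-up example on a three-point state space:

```python
import numpy as np

# An example stochastic matrix on E = {0, 1, 2}: nonnegative entries, rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# P^l should again be stochastic for every l >= 1.
for l in range(1, 5):
    Pl = np.linalg.matrix_power(P, l)
    assert np.all(Pl >= 0)                      # entries stay nonnegative
    assert np.allclose(Pl.sum(axis=1), 1.0)     # rows still sum to 1
```

This reflects the general fact that products of stochastic matrices are stochastic, which is what makes the subsampled chain $(X_{nl})$ well defined.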

Up to here I understand everything, even if I do not really see why it is important.

Now let $X_0 \sim \mu$. Show that for all $f: E \to \mathbb{R}$, $$ \mathbb{E}_\mu(f(X_n)) = \mu P^n f, $$ where $\mu$ is a row vector and $f$ a column vector.
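To see what the identity says concretely: $\mu P^n$ is the distribution of $X_n$, and pairing it with the column vector $f$ gives the expectation. A small numerical check of this, with a made-up $P$, $\mu$, and $f$ (none of these come from the exercise):

```python
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
mu = np.array([0.2, 0.5, 0.3])   # initial distribution (row vector)
f = np.array([1.0, -2.0, 5.0])   # f viewed as a column vector
n = 4

# Right-hand side: mu P^n f, a single number.
exact = mu @ np.linalg.matrix_power(P, n) @ f

# Left-hand side: Monte Carlo estimate of E_mu[f(X_n)] by simulating the chain.
trials = 200_000
total = 0.0
for _ in range(trials):
    x = rng.choice(3, p=mu)          # draw X_0 ~ mu
    for _ in range(n):
        x = rng.choice(3, p=P[x])    # one step of the chain
    total += f[x]
estimate = total / trials

print(exact, estimate)  # the two values agree up to Monte Carlo error
```

So the formula is just the matrix-algebra way of saying "the law of $X_n$ is $\mu P^n$", which is why it is stated before the spectral questions.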

I don't really understand this part either: what is its purpose?

Show that $P$ has $1$ as eigenvalue.

If we have shown this, what can we conclude from it?
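The key observation behind this part is that the all-ones column vector $\mathbf{1}$ satisfies $(P\mathbf{1})(x) = \sum_y P(x,y) = 1$ for every $x$, so $P\mathbf{1} = \mathbf{1}$ and $1$ is an eigenvalue with right eigenvector $\mathbf{1}$. A one-line numerical check, again with a made-up stochastic matrix:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

ones = np.ones(3)
# Rows sum to 1, so P @ 1 = 1: the all-ones vector is a right eigenvector for eigenvalue 1.
assert np.allclose(P @ ones, ones)
assert np.any(np.isclose(np.linalg.eigvals(P), 1.0))
```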

From here on we suppose that $P$ is symmetric and $E$ finite. Show that $\lambda_1 = 1$ is the largest eigenvalue. Supposing that $\lambda_2$ is the second-largest eigenvalue, is $\lambda_2 < 1$?
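For this last part it helps to experiment: a symmetric stochastic matrix has real eigenvalues in $[-1, 1]$, but $\lambda_2 < 1$ can fail when the chain is reducible. A numerical illustration with two matrices of my own making, one irreducible and one block-diagonal:

```python
import numpy as np

# Irreducible symmetric stochastic matrix: second-largest eigenvalue is strictly below 1.
P_irr = np.array([[0.5, 0.3, 0.2],
                  [0.3, 0.4, 0.3],
                  [0.2, 0.3, 0.5]])

# Reducible symmetric stochastic matrix (two disconnected blocks):
# eigenvalue 1 has multiplicity 2, so lambda_2 = 1.
P_red = np.array([[0.5, 0.5, 0.0, 0.0],
                  [0.5, 0.5, 0.0, 0.0],
                  [0.0, 0.0, 0.5, 0.5],
                  [0.0, 0.0, 0.5, 0.5]])

for P in (P_irr, P_red):
    # eigvalsh is for symmetric matrices and returns real eigenvalues.
    vals = np.sort(np.linalg.eigvalsh(P))[::-1]
    print(vals)
```

This suggests the intended answer: $\lambda_2 < 1$ holds under an extra assumption such as irreducibility, and the gap $1 - \lambda_2$ is exactly what controls how fast $\mu P^n f$ converges, which ties the whole exercise together.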

I just want to make it clear that I have the solutions of most of it but I don't understand the meaning of the task although I think there is much more to it than just calculating some eigenvalues.


Read the introduction of standard books on the subject, such as [1] or [2] or [3], to understand the motivation.

[1] James Robert Norris. Markov chains. Cambridge university press, 1998.

[2] Levin, David A., and Yuval Peres. Markov chains and mixing times. Vol. 107. American Mathematical Soc., 2017. https://yuvalperes.com/markov-chains-and-mixing-times-2/

[3] Brémaud, Pierre. Markov chains: Gibbs fields, Monte Carlo simulation, and queues. Vol. 31. Springer Science & Business Media, 2013.