Infinite number of i.i.d. Markov chains and the law of large numbers


Let $(X(t))_{t \geq 0}$ be a continuous-time, time-homogeneous Markov chain with values in a state space $S$ that we assume to be finite (countable should also work). For $(i,j) \in S^2$ and $t, s \geq 0$, we denote by $P(t) = (p_{ij}(t))_{ij}$ the matrix whose coefficients are the transition probabilities:

$$p_{ij}(t) := P(X_{s+t}=j \mid X_{s}=i),$$ which does not depend on $s$ by time-homogeneity, so one can take $s=0$.

Under suitable assumptions, for instance if the Markov chain is irreducible and admits a stationary distribution $\pi$ on $S$ (i.e. a probability vector satisfying $\pi P(t) = \pi$ for all $t \geq 0$), one knows the limiting behavior of $p_{ij}(t)$, in the sense that

$$p_{ij}(t) \underset{t \rightarrow +\infty}{\longrightarrow} \pi_{j}$$ for any $(i,j) \in S^2$.
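This convergence is easy to check numerically for a small chain. The sketch below uses a hypothetical $3$-state generator matrix $Q$ (not from the post; any irreducible generator would do), computes $P(t) = e^{tQ}$, and verifies that every row approaches the stationary distribution $\pi$ solving $\pi Q = 0$:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state generator matrix Q (rows sum to zero); irreducible.
Q = np.array([[-2.0,  1.0,  1.0],
              [ 1.0, -3.0,  2.0],
              [ 2.0,  2.0, -4.0]])

# Stationary distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# P(t) = exp(tQ); the rows of P(t) all converge to pi as t grows.
for t in [0.1, 1.0, 10.0]:
    P_t = expm(t * Q)
    print(t, np.max(np.abs(P_t - pi)))  # distance to pi shrinks with t
```

The printed maximum deviation decays at the rate of the spectral gap of $Q$ (here the nonzero eigenvalues are roughly $-3.4$ and $-5.6$), so by $t = 10$ the rows of $P(t)$ agree with $\pi$ to machine precision.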

Now consider a family $\{ (X^M(t))_{t \geq 0} : M = 1, \dots, N \}$ of $N$ such Markov chains, independent and identically distributed. All the chains start from the same position, i.e. there is an $i_0 \in S$ such that $X^M(0) = i_0$ for all $M$ (though I don't think this is even necessary).

I would like to know more about the following statement: given any positive time $t_0>0$, we have the following convergence for any $(i,j) \in S^2$:

$$\frac{1}{N}\sum_{M=1}^N p_{ij}^M(t_0) := \frac{1}{N}\sum_{M=1}^N P(X_{t_0}^M=j \mid X_{0}^M=i) \underset{N \rightarrow +\infty}{\longrightarrow} \pi_j.$$ In a way, this result might be related to ergodic theory, since we exchanged the time average for a probability average.
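A closely related empirical statement (averaging indicators over the i.i.d. copies rather than their common transition probabilities) can be probed by simulation. The sketch below uses a hypothetical $3$-state generator $Q$ (an assumption, not from the post), runs $N$ independent copies of the chain from $i_0$ up to time $t_0$ via the Gillespie algorithm, and compares the empirical frequency of state $j$ with the exact $p_{i_0 j}(t_0)$ from the matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Hypothetical generator matrix (assumption; rows sum to zero).
Q = np.array([[-2.0,  1.0,  1.0],
              [ 1.0, -3.0,  2.0],
              [ 2.0,  2.0, -4.0]])

def sample_state(i0, t0):
    """Gillespie simulation: run one chain from state i0, return X(t0)."""
    i, t = i0, 0.0
    while True:
        rate = -Q[i, i]                     # total jump rate out of state i
        t += rng.exponential(1.0 / rate)    # exponential holding time
        if t > t0:
            return i                        # still in state i at time t0
        # Jump: pick the next state proportionally to off-diagonal rates.
        probs = Q[i].copy()
        probs[i] = 0.0
        i = rng.choice(len(probs), p=probs / rate)

i0, j, t0, N = 0, 2, 1.0, 20000
freq = np.mean([sample_state(i0, t0) == j for _ in range(N)])
exact = expm(t0 * Q)[i0, j]
print(freq, exact)  # empirical frequency close to p_{i0 j}(t0)
```

By the ordinary law of large numbers for i.i.d. Bernoulli variables, `freq` converges to $p_{i_0 j}(t_0)$ as $N \to \infty$ (with fluctuations of order $N^{-1/2}$); combined with the $t \to \infty$ limit above, for large $t_0$ this is close to $\pi_j$.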

My questions are the following: does the setting of the question look clear to you, and do you think such a result holds (perhaps under modified assumptions)? I gladly welcome any thoughts, remarks, or references on the subject!