How can I prove that $ p_{x,y}^{(n)}=P(X_n=y|X_0=x)$?


Let $(\Omega,\mathcal{A},P)$ be a probability space.

Let $E$ be a countable set, $\Bbb P:=(p_{x,y})_{x,y\in E}$ a stochastic matrix (i.e. $p_{x,y}\ge0$ and $\sum_{y\in E}p_{x,y}=1$ for every $x\in E$) and $\mu$ a probability measure on $E$. We know that there then exists a Markov chain $(X_n)_{n\ge0}$ defined on $(\Omega,\mathcal{A},P)$ such that $$ P(X_{k+1}=y|X_k=x)=p_{x,y}\;\;\forall x,y\in E,\;\;\forall k\ge0\;. $$

Now let $\Bbb P^n=(p_{x,y}^{(n)})_{x,y\in E}$ denote the $n$-th power of the matrix $\Bbb P$.

We know that $$ p_{x,y}^{(n)}=\sum_{x_1,\dots,x_{n-1}\in E}p_{x,x_1}p_{x_1,x_2}\cdots p_{x_{n-1},y}\;\;. $$
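(Not part of the proof, but as a quick sanity check: for a small chain one can verify numerically that the matrix-power entry agrees with the explicit sum over paths. The 2-state matrix below is an assumed example, not from the question.)

```python
import itertools

import numpy as np

# A hypothetical 2-state stochastic matrix (rows sum to 1).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
n = 4

# p_{x,y}^{(n)} via the matrix power.
Pn = np.linalg.matrix_power(P, n)

def path_sum(P, x, y, n):
    """Explicit sum over paths x -> x_1 -> ... -> x_{n-1} -> y."""
    states = range(P.shape[0])
    total = 0.0
    for path in itertools.product(states, repeat=n - 1):
        full = (x, *path, y)
        prob = 1.0
        for a, b in zip(full, full[1:]):
            prob *= P[a, b]
        total += prob
    return total

for x in range(2):
    for y in range(2):
        assert abs(Pn[x, y] - path_sum(P, x, y, n)) < 1e-12
```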

How can I prove that $$ p_{x,y}^{(n)}=P(X_n=y|X_0=x)\;\;\;? $$ I think it shouldn't be difficult, but I'm stuck.

I tried the case $n=2$, but after writing \begin{align*} p_{x,y}^{(2)} &=\sum_{z\in E}p_{x,z}p_{z,y}\\ &=\sum_{z\in E}P(X_1=z|X_0=x)P(X_2=y|X_1=z) \end{align*} I don't know how to handle this.

Can someone help me? Many thanks.

There are two answers below.

Accepted answer:

For a formal derivation, you need three facts.

The first is the identity $$ P(A|B) = \frac{P(A,B)}{P(B)}. $$

The second is a defining property of Markov chains, namely that $$ P(X_{n+1}=x_{n+1} \,|\, X_0=x_0,\,\ldots,\, X_n=x_n) = P(X_{n+1}=x_{n+1} \,|\, X_n=x_n), $$ i.e. that $X_n$ contains just as much information about $X_{n+1}$ as the whole history $X_0,\,\ldots,\,X_n$ does.

The third is (countable) additivity: if $A = A_1 \cup A_2 \cup \ldots$ with the $A_i$ pairwise disjoint, then $$ P(A) = P(A_1) + P(A_2) + \ldots, $$ and the same holds if you add an arbitrary $B$ with $P(B)>0$ as a condition, i.e. $$ P(A|B) = P(A_1\,|\,B) + P(A_2\,|\,B) + \ldots\;. $$ (Countable additivity is what the sum over $z\in E$ requires, since $E$ is countable.)

Using the first two identities, you get $$\begin{eqnarray} P(X_2=y\,|\,X_1=z)P(X_1=z\,|\,X_0=x) &=& P(X_2=y\,|\,X_1=z,X_0=x)P(X_1=z\,|\,X_0=x) \\ &=& \frac{P(X_2=y,X_1=z,X_0=x)}{P(X_1=z,X_0=x)}\cdot\frac{P(X_1=z,X_0=x)}{P(X_0=x)} \\ &=& \frac{P(X_2=y,X_1=z,X_0=x)}{P(X_0=x)} \\ &=& P(X_2=y,X_1=z|X_0=x). \end{eqnarray}$$

The third identity then yields $$\begin{eqnarray} \sum_{z} P(X_2=y\,|\,X_1=z)P(X_1=z\,|\,X_0=x) &=& \sum_{z} P(X_2=y,X_1=z|X_0=x) \\ &=& P(X_2=y\,|\,X_0=x). \end{eqnarray}$$
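The general identity $p_{x,y}^{(n)}=P(X_n=y|X_0=x)$ then follows by induction on $n$, conditioning on $X_{n-1}$ exactly as above. As a numerical illustration of the $n=2$ computation (using an assumed 3-state chain and initial distribution, not from the original post):

```python
import numpy as np

# Hypothetical 3-state stochastic matrix and initial distribution of X_0.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
mu = np.array([0.2, 0.5, 0.3])

# Joint law of (X_0, X_1, X_2) for a Markov chain: mu(x) * p_{x,z} * p_{z,y}.
joint = np.einsum('x,xz,zy->xzy', mu, P, P)

for x in range(3):
    for y in range(3):
        # Left side: P(X_2=y | X_0=x), computed from the joint law by
        # summing out z and dividing by P(X_0=x).
        lhs = joint[x, :, y].sum() / mu[x]
        # Right side: sum_z P(X_2=y|X_1=z) P(X_1=z|X_0=x) = (P^2)_{x,y}.
        rhs = sum(P[z, y] * P[x, z] for z in range(3))
        assert abs(lhs - rhs) < 1e-12
```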

Another answer:

Hint:

If $\lambda_k$ denotes the distribution of $X_k$, written as a row vector, we know that $\lambda_1 = \lambda_0\mathbb{P}$. It follows that $\lambda_2 = \lambda_1\mathbb{P} = \lambda_0\mathbb{P}^2$.

Induction then shows that $\lambda_n = \lambda_0\mathbb{P}^n$. Taking $\lambda_0 = \delta_x$ (the chain started at $x$) and reading off the $y$-entry gives $P(X_n=y|X_0=x) = (\delta_x\mathbb{P}^n)_y = p_{x,y}^{(n)}$.
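A quick numerical check of the hint (with an assumed 2-state chain and initial distribution), comparing the step-by-step recursion $\lambda_{k+1}=\lambda_k\mathbb{P}$ against the closed form $\lambda_n=\lambda_0\mathbb{P}^n$:

```python
import numpy as np

# Hypothetical chain and initial distribution (assumed for illustration).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
lam0 = np.array([0.3, 0.7])
n = 5

# Step-by-step: lambda_{k+1} = lambda_k P, applied n times.
lam = lam0.copy()
for _ in range(n):
    lam = lam @ P

# Closed form: lambda_n = lambda_0 P^n.
assert np.allclose(lam, lam0 @ np.linalg.matrix_power(P, n))
```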