
My friends and I are working through practice questions and came across this one, where I took a different approach. My friends used the Chapman-Kolmogorov equation, which I don't understand how to apply here. This is how I approached it:

Question

Let $p_{x y}^{(n)}=\mathbb{P}\left(X_{n}=y \mid X_{0}=x\right)$ be the $n$-step transition probabilities of an arbitrary Markov chain on a state space $S$. Suppose that $$ \forall y \in S, \sum_{x} p_{x y}=1 $$ where $p_{x y}=p_{x y}^{(1)}$. Show that for any integer $n \geq 0$ $$ \forall y \in S, \sum_{x} p_{x y}^{(n)}=1 $$
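(A sanity check of my own, not part of the question.) The claim can be illustrated numerically: take a hypothetical transition matrix $P$ whose rows sum to $1$ (stochastic) and whose columns also sum to $1$ (the extra hypothesis), and check that the column sums of $P^n$ stay at $1$ as $n$ grows:

```python
# Hypothetical 3-state chain: rows sum to 1 (stochastic) and columns
# sum to 1 (the question's extra hypothesis), i.e. doubly stochastic.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.3, 0.2, 0.5],
]

def matmul(A, B):
    """Plain matrix product, so that p_xy^(n) is the (x, y) entry of P^n."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

Pn = [[float(i == j) for j in range(3)] for i in range(3)]  # P^0 = identity
for n in range(6):
    col_sums = [sum(Pn[x][y] for x in range(3)) for y in range(3)]
    print(n, [round(s, 10) for s in col_sums])  # every column sums to 1
    Pn = matmul(Pn, P)
```

This only checks one example, of course; the proof below shows it in general.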

My attempt

I used the Markov process proposition which states that, for a Markov process, $\mathbb{P}\left(X_{0}=x_{0}, X_{1}=x_{1}, \ldots, X_{n}=x_{n}\right)=\mathbb{P}\left(X_{0}=x_{0}\right) \mathbb{P}\left(X_{1}=x_{1} \mid X_{0}=x_{0}\right) \cdots \mathbb{P}\left(X_{n}=x_{n} \mid X_{n-1}=x_{n-1}\right)$.

The Markov property states that, given the present, the future is conditionally independent of the past. Hence I get:

$\mathbb{P}\left(X_{n}=y \mid X_{0}=x_{0}\right)=\frac{\mathbb{P}\left(X_{0}=x_{0}, \ldots, X_{n}=y\right)}{\mathbb{P}\left(X_{0}=x_{0}, \ldots, X_{n-1}=y_{n-1}\right)}=p_{y_{n-1} y}$

And since the question states $\forall y \in S, \sum_{x} p_{x y}=1$, this implies $\sum_{y_{n-1}} p_{y_{n-1} y}=1$, hence proven.

Am I right?

Best answer

We can prove this fact by induction with the Chapman-Kolmogorov equation.

CK: For all $0<k<n$, we have $p_{xy}^{(n)}=\sum_{z}p_{xz}^{(k)}\cdot p_{zy}^{(n-k)}$.
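In matrix form, CK is just matrix multiplication: if $P$ holds the one-step probabilities, then $p_{xy}^{(n)}$ is the $(x,y)$ entry of $P^n$, and the sum over $z$ is the entry of $P^{k} P^{n-k}$. A minimal sketch with a hypothetical 2-state matrix of my choosing:

```python
# Chapman-Kolmogorov as matrix multiplication: p_xy^(n) = sum_z p_xz^(k) p_zy^(n-k)
# is entrywise the identity P^n = P^k * P^(n-k).
P = [[0.9, 0.1],
     [0.4, 0.6]]  # hypothetical 2-state one-step transition matrix

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(M, n):
    R = [[float(i == j) for j in range(len(M))] for i in range(len(M))]
    for _ in range(n):
        R = matmul(R, M)
    return R

n, k = 5, 2
lhs = matpow(P, n)                            # p_xy^(n)
rhs = matmul(matpow(P, k), matpow(P, n - k))  # sum_z p_xz^(k) p_zy^(n-k)
print(all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
          for i in range(2) for j in range(2)))
```

The two sides agree up to floating-point tolerance, which is the content of CK for this choice of $k$ and $n$.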


For $n=0$, the statement is trivial, since $p_{xy}^{(0)}=\delta_{xy}=\begin{cases}1, & x=y,\\ 0, & \text{otherwise},\end{cases}$ so for all $y$, $\sum_x p_{xy}^{(0)}=1$. For $n=1$, it holds as well: it is exactly your assumption.

Now assume that the statement already holds for all times $0,1,\ldots,n-1$.

We know by CK that $p_{xy}^{(n)}=\sum_{z}p_{xz}^{(1)}\cdot p_{zy}^{(n-1)}$. So fix $y$ and sum both sides over $x$: \begin{align} \sum_{x} p_{xy}^{(n)} &= \sum_{x}\sum_{z}p_{xz}^{(1)}\cdot p_{zy}^{(n-1)}\\ &=\sum_z \sum_x p_{xz}^{(1)}\cdot p_{zy}^{(n-1)}\\ &= \sum_z p_{zy}^{(n-1)} \cdot \sum_x p_{xz}^{(1)} \\ &= \sum_z p_{zy}^{(n-1)} \cdot 1\\ &= 1. \end{align} We used the induction hypothesis in the second-to-last equality for time $1$, and again in the last equality for time $n-1$.