Show that $X_n := \sum_{i=1}^n Y_i$ is a Markov chain, where the $Y_i$ are iid random variables on $\mathbb Z^k$


Given $(Y_n)_{n \in \mathbb N_0}$, a sequence of iid random variables on $\mathbb Z^k$ with common law $\mu$, I want to show that $X_n=\sum_{i=1}^n Y_i$ is a Markov chain. My current solution is

$$\begin{align} P(X_{n+1}=j \mid X_n=x_n,\dots,X_1=x_1)&=P\Big(\sum^{n+1}_{i=1} Y_i =j \,\Big|\, \sum^{n}_{i=1} Y_i = x_n,\dots, Y_1=x_1\Big)\\ &= P\Big(\sum^{n+1}_{i=1} Y_i =j \,\Big|\, \sum^{n}_{i=1} Y_i = x_n\Big) \\ &= P(X_{n+1}=j \mid X_n=x_n)\end{align}$$ since $Y_{n+1}$ is independent of $Y_1,\dots,Y_n$, all the relevant information is contained in $\sum^{n}_{i=1} Y_i$ already, i.e. $X_{n+1}$ depends on the past only through $X_n$. Is that correct? And what would then be the transition matrix and the initial distribution of $X_n$?
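As a plausibility check (not a substitute for the proof), one can compare empirical conditional distributions for a concrete walk; purely for illustration, assume the step law $\mu$ is uniform on $\{-1,0,1\} \subset \mathbb Z$. Conditioning on the full history $(X_1, X_2) = (1, 0)$ versus on $X_2 = 0$ alone should then give the same distribution for $X_3$:

```python
import random

random.seed(0)
STEPS = (-1, 0, 1)  # hypothetical step law mu: uniform on {-1, 0, 1}

def walk(n):
    """One sample path (X_1, ..., X_n) of the walk X_n = Y_1 + ... + Y_n."""
    x, path = 0, []
    for _ in range(n):
        x += random.choice(STEPS)
        path.append(x)
    return path

N = 200_000
full_n = full_hit = last_n = last_hit = 0
for _ in range(N):
    x1, x2, x3 = walk(3)
    if x2 == 0:                      # condition on the last state only
        last_n += 1
        last_hit += (x3 == 1)
        if x1 == 1:                  # condition on the full history
            full_n += 1
            full_hit += (x3 == 1)

p_full = full_hit / full_n   # estimates P(X_3 = 1 | X_2 = 0, X_1 = 1)
p_last = last_hit / last_n   # estimates P(X_3 = 1 | X_2 = 0)
print(p_full, p_last)        # both should be close to mu({1}) = 1/3
```

Up to sampling noise, both estimates agree, as the claimed Markov property predicts.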


To show that $(X_n)_n$ is a Markov chain, you have to prove

$$ P(X_n=j \mid X_{n-1}=i_{n-1},\dots, X_1=i_1)=P(X_n=j \mid X_{n-1}=i_{n-1}). $$

For the transition matrix, let us look at the entry $p_{i,j}$ (note that $i$ denotes a state here, so we use $m$ as the summation index):

$$\begin{align}P(X_n=j \mid X_{n-1}=i)&=P\Big(X_n=j \,\Big|\, \sum_{m=1}^{n-1}Y_m=i\Big) \\ &=P\Big(\sum_{m=1}^{n}Y_m=j\,\Big|\,\sum_{m=1}^{n-1}Y_m=i\Big) \\ &=P\Big(\sum_{m=1}^{n-1}Y_m+Y_n=j\,\Big|\,\sum_{m=1}^{n-1}Y_m=i\Big)\\ &=P(i+Y_n=j) \\ &=P(Y_n=j-i),\end{align}$$

where the fourth equality uses that $Y_n$ is independent of $\sum_{m=1}^{n-1}Y_m$.

Now, what is given about the $(Y_n)_n$, and what is therefore the last probability?
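As a quick numerical sanity check (not part of the proof), one can simulate the walk for a concrete choice of $\mu$ — here, purely as an assumed example, the uniform law on $\{-1,0,1\} \subset \mathbb Z$ — and verify that the empirical transition frequencies out of different states $i$ all match $\mu(\{j-i\})$:

```python
import random
from collections import Counter, defaultdict

random.seed(1)
STEPS = (-1, 0, 1)  # hypothetical step law mu: uniform on {-1, 0, 1}

# Sample many independent transitions (X_3, X_4) of the walk and record
# the jump j - i = Y_4, keyed by the current state i = X_3.
by_state = defaultdict(Counter)
for _ in range(300_000):
    x3 = sum(random.choice(STEPS) for _ in range(3))  # X_3 = Y_1 + Y_2 + Y_3
    y4 = random.choice(STEPS)                         # Y_4, so X_4 = x3 + y4
    by_state[x3][y4] += 1

# p_{i,j} should depend only on d = j - i and be close to mu({d}) = 1/3:
for i in (0, 1):
    total = sum(by_state[i].values())
    print(i, {d: round(by_state[i][d] / total, 3) for d in STEPS})
```

As for the initial distribution: with the usual convention $X_0 := 0$ (the empty sum), it is simply $\delta_0$; if one instead starts the chain at $X_1$, its law is $\mu$ itself.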