Prove Markov Chain by definition


I came across this problem in homework:

$X_n$ are i.i.d. random variables with $\mathbb{P}[X_n=1]=\mathbb{P}[X_n=-1]=\frac{1}{2}$, and we also have $S_n=X_1+\ldots+X_n.$ Show that $S_n'=|S_n|$ is a Markov chain.

I can see intuitively that it is indeed a Markov chain. And I know that in this question we should first show that $S_n$ is a Markov chain and then that $S_n'$ is. But can anyone prove it purely from the definition of a Markov chain? (First, how should we show that $S_n$ is a Markov chain?)

Definition: Let $(\Omega, F,\mathbb{P})$ be a probability space and $S$ a countable set. A sequence $X_1,X_2,\ldots$ of $S$-valued random variables is called a Markov chain if for all $n\geq 1$ and all $s_1,s_2,\ldots,s_n\in S$, we have $\mathbb{P}[X_n=s_n \mid X_{n-1}=s_{n-1},\ldots,X_1=s_1]=\mathbb{P}[X_n=s_n \mid X_{n-1}=s_{n-1}]$.

It might be a stupid question, but I would really appreciate it if you could help me with this. Thanks!



Best answer:

For the second part ($S_n' = |S_n|$), it is harder to write down. Consider

$P(|S_n| = x \mid |S_{n-1}|, |S_{n-2}|, \ldots)$

Now, this probability is always $0$ unless $x=|S_{n-1}|-1$ or $x=|S_{n-1}|+1$.

Writing out this conditional probability explicitly:

$P(|S_n| = x \mid |S_{n-1}|,|S_{n-2}|,\ldots)= \dfrac{1}{2}\, \mathbb{1}_{\{x=|S_{n-1}|+1,\ |S_{n-1}|\neq 0\}}+\dfrac{1}{2}\, \mathbb{1}_{\{x=|S_{n-1}|-1,\ |S_{n-1}|\neq 0\}}+ \mathbb{1}_{\{x=1,\ |S_{n-1}|= 0\}}$

This expression depends only on $|S_{n-1}|$ (it is $\sigma(|S_{n-1}|)$-measurable), so $P(|S_n| = x \mid |S_{n-1}|,|S_{n-2}|,\ldots) = P(|S_n| = x \mid |S_{n-1}|)$, which is exactly the Markov property.
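As a sanity check (not a substitute for the proof), a short simulation can estimate these transition probabilities for the reflected chain $|S_n|$. This is an illustrative sketch; the variable names and sample size are my own choices:

```python
import random
from collections import Counter

random.seed(0)

# Simulate S_n = X_1 + ... + X_n with X_i = +/-1 and record the
# transitions of the reflected chain |S_n|.
transitions = Counter()  # (current |S|, next |S|) -> count
s = 0
for _ in range(500_000):
    cur = abs(s)
    s += random.choice((-1, 1))
    transitions[(cur, abs(s))] += 1

# From state 0 the reflected chain must move to 1 with probability 1
# (this matches the third indicator in the formula above).
from_zero = sum(c for (a, _), c in transitions.items() if a == 0)
print(transitions[(0, 1)] / from_zero)  # -> 1.0

# From a state k > 0 it moves to k-1 or k+1 with probability 1/2 each;
# the estimate below should be close to 0.5.
from_one = sum(c for (a, _), c in transitions.items() if a == 1)
print(transitions[(1, 2)] / from_one)
```

The first ratio is exactly $1$ because a step from $S_{n-1}=0$ lands at $\pm 1$, both of which have absolute value $1$; the second is only approximately $\tfrac12$ due to sampling noise.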

The difference between this and showing that $S_n$ is Markov is that $S_n$ is a random walk: its transition probability does not even depend on the current state, since $P(S_n = s+1 \mid S_{n-1}=s) = P(S_n = s-1 \mid S_{n-1}=s) = \frac{1}{2}$ for every $s$. For $|S_n|$ we lose that property: from state $0$ the chain jumps to $1$ with probability $1$.

Another answer:

To show $S_n$ is a Markov chain, you need to show that $$P(S_n=x|S_1,\ldots,S_{n-1})=P(S_n=x|S_{n-1}).$$

In other words, to determine the transition probability to $S_n$, all you need is $S_{n-1}$, even if you are given the entire past. To see this, write $S_n=S_{n-1}+X_n$. Conditioning on $S_{n-1}$ fixes it, so the only random thing left is $X_n$, which is independent of $S_{n-1},S_{n-2},\ldots,S_1$. In symbols, for any $\epsilon\in\{-1,1\}$, $P(X_n=\epsilon \mid S_{n-1},\ldots,S_1)=P(X_n=\epsilon)=P(X_n=\epsilon \mid S_{n-1})$. So if you are studying $P(S_n=x\mid S_{n-1},\ldots,S_1)$, it is equivalent to write $$P(X_n=x-S_{n-1}\mid S_{n-1},\ldots,S_1)=P(X_n=x-S_{n-1}\mid S_{n-1})=P(S_n=x\mid S_{n-1}).$$
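The argument above can also be checked empirically (again, a numerical illustration, not a proof): conditioning on extra past values should not change the transition probability. The particular states conditioned on below ($S_{n-1}=1$, $S_{n-2}=0$) are an arbitrary choice:

```python
import random

random.seed(1)

# Simulate one long path of the simple random walk S_n.
walk = [0]
for _ in range(1_000_000):
    walk.append(walk[-1] + random.choice((-1, 1)))

# Next values of the walk given S_{n-1} = 1 ...
nxt_given_prev = [walk[i + 1] for i in range(1, len(walk) - 1)
                  if walk[i] == 1]
# ... versus given S_{n-1} = 1 AND the extra condition S_{n-2} = 0.
nxt_given_pair = [walk[i + 1] for i in range(2, len(walk) - 1)
                  if walk[i] == 1 and walk[i - 1] == 0]

# Both estimates of P(S_n = 2 | ...) should be close to 1/2:
# the extra conditioning on S_{n-2} changes nothing.
p_prev = sum(v == 2 for v in nxt_given_prev) / len(nxt_given_prev)
p_pair = sum(v == 2 for v in nxt_given_pair) / len(nxt_given_pair)
print(round(p_prev, 2), round(p_pair, 2))
```

Successive visits to the same state are not independent samples, but the law of large numbers for recurrent chains still makes both estimates converge to $\tfrac12$.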