I came across this problem in homework:
$X_n$ are i.i.d. random variables with $\mathbb{P}[X_n=1]=\mathbb{P}[X_n=-1]=\frac{1}{2}$, and we also set $S_n=X_1+\dots+X_n.$ Show that $S_n'=|S_n|$ is a Markov chain.
I can see intuitively that it is indeed a Markov chain, and I know that in this question we should first show $S_n$ is a Markov chain and then do the same for $S_n'$. But can anyone prove it purely from the definition of a Markov chain? (To start: how should we show $S_n$ is a Markov chain?)
Definition: Let $(\Omega, \mathcal{F},\mathbb{P})$ be a probability space and $S$ a countable set. A sequence $X_1,X_2,\dots$ of $S$-valued random variables is called a Markov chain if for all $n\geq 2$ and all $s_1,s_2,\dots,s_n\in S$, we have $$\mathbb{P}[X_n=s_n\mid X_{n-1}=s_{n-1},\dots,X_1=s_1]=\mathbb{P}[X_n=s_n\mid X_{n-1}=s_{n-1}].$$
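For the first step, here is a sketch of how the definition can be checked directly for $S_n$, using only the independence of the increments (the key observation is that $S_n=s_n$ given $S_{n-1}=s_{n-1}$ is the same event as $X_n=s_n-s_{n-1}$):

$$\mathbb{P}[S_n=s_n\mid S_{n-1}=s_{n-1},\dots,S_1=s_1] = \mathbb{P}[X_n=s_n-s_{n-1}\mid S_{n-1}=s_{n-1},\dots,S_1=s_1] = \mathbb{P}[X_n=s_n-s_{n-1}],$$

where the last equality holds because $X_n$ is independent of $\sigma(X_1,\dots,X_{n-1})\supseteq\sigma(S_1,\dots,S_{n-1})$. Running the same computation conditioning only on $S_{n-1}=s_{n-1}$ gives the same value $\mathbb{P}[X_n=s_n-s_{n-1}]$, so the two conditional probabilities agree and $S_n$ is Markov.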
It might be a stupid question, but I would really appreciate it if you could help me with this. Thanks!
For the second part, it is harder to write down. Consider
$$P(|S_n| = x \mid |S_{n-1}|, |S_{n-2}|, \dots)$$
Now, this conditional probability is always $0$ unless $x=|S_{n-1}|-1$ or $x=|S_{n-1}|+1$.
If we write out this conditional probability explicitly:
$P(|S_n| = x | |S_{n-1}|,|S_{n-2}|...)= \dfrac{1}{2} \mathbb{1}_{\{x=|S_{n-1}|+1, |S_{n-1}|\neq 0\}}+\dfrac{1}{2} \mathbb{1}_{\{x=|S_{n-1}|-1, |S_{n-1}|\neq 0\}}+ \mathbb{1}_{\{x=1, |S_{n-1}|= 0\}}$
That expression depends only on $|S_{n-1}|$, i.e. it is $\sigma(|S_{n-1}|)$-measurable, so $$P(|S_n| = x \mid |S_{n-1}|,|S_{n-2}|,\dots) = P(|S_n| = x \mid |S_{n-1}|).$$
The difference between this and showing $S_n$ is Markov is that $S_n$ is a random walk, so its transition probability does not even depend on the current state: from any $s$ it moves to $s\pm 1$ with probability $\frac{1}{2}$ each. Here we lose that spatial homogeneity (the transition probabilities at $0$ differ from those elsewhere), but the chain is still Markov.
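As an empirical sanity check (not part of the proof), one can simulate the walk and tally the transition frequencies of $|S_n|$; the helper name `estimate_transition_probs` below is my own invention:

```python
import random

def estimate_transition_probs(n_steps=200_000, seed=0):
    """Simulate S_n and estimate P(|S_n| = |S_{n-1}| + 1)
    separately for |S_{n-1}| = 0 and |S_{n-1}| != 0."""
    rng = random.Random(seed)
    s = 0
    from_zero_up = from_zero_total = 0    # transitions out of state 0
    nonzero_up = nonzero_total = 0        # transitions out of nonzero states
    for _ in range(n_steps):
        prev_abs = abs(s)
        s += rng.choice((1, -1))          # X_n = +1 or -1 with prob 1/2
        if prev_abs == 0:
            from_zero_total += 1
            from_zero_up += (abs(s) == prev_abs + 1)
        else:
            nonzero_total += 1
            nonzero_up += (abs(s) == prev_abs + 1)
    return from_zero_up / from_zero_total, nonzero_up / nonzero_total

p_zero, p_nonzero = estimate_transition_probs()
print(p_zero)     # exactly 1.0: from 0 the chain always moves to 1
print(p_nonzero)  # close to 0.5
```

This matches the formula above: from $0$ the chain moves to $1$ with probability $1$, and from any nonzero state it moves up or down with probability $\frac{1}{2}$ each.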