Sum of i.i.d. random variables is a Markov chain


I think I have some trouble understanding Markov chains: we defined them as abstract objects, but our professor writes proofs with them as if they were just elementary conditional probabilities.

This is our definition of a Markov chain: given a probability space $(\Omega, \mathcal{A}, \mathbb{P})$, a standard Borel space $(S, \mathcal{S})$ and a sequence of random variables $X_n: \Omega \to S$, $n \in \mathbb{N}$, the sequence $(X_n)_{n\in \mathbb{N}}$ is called a Markov chain if $$\forall B \in \mathcal{S}: \quad \mathbb{E}[\mathbb{1}_B(X_n)\mid\sigma(X_0, \dots, X_{n-1})] = \mathbb{E}[\mathbb{1}_B(X_n) \mid\sigma(X_{n-1})] \quad \text{a.s.}$$
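(For context, the bridge between the two notations, as I understand it, is that conditional probability given a sub-$\sigma$-algebra is *defined* as the conditional expectation of an indicator: for any sub-$\sigma$-algebra $\mathcal{G} \subseteq \mathcal{A}$,

$$\mathbb{P}(X_n \in B \mid \mathcal{G}) := \mathbb{E}\big[\mathbb{1}_B(X_n) \,\big|\, \mathcal{G}\big] \quad \text{a.s.},$$

so the defining condition can equivalently be read as $\mathbb{P}(X_n \in B \mid X_0, \dots, X_{n-1}) = \mathbb{P}(X_n \in B \mid X_{n-1})$ a.s.)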


So far, so good. But now we have the following proposition:

Given i.i.d. random variables $(\xi_i)_{i\in \mathbb{N}}$ on $\mathbb{R}^d$ and a random variable $X_0$ (also on $\mathbb{R}^d$) independent of $(\xi_i)_{i \in \mathbb{N}}$, define $X_n := X_0 + \sum_{i=1}^n \xi_i$. Then $(X_n)_{n \in \mathbb{N}_0}$ is a Markov chain.
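(Not part of the proof, but a quick numerical sanity check of the claim: the sketch below simulates a one-dimensional instance with $\pm 1$ increments and checks that $\mathbb{P}(X_3 = 2 \mid X_2 = 1, X_1 = x_1)$ does not depend on $x_1$. All names and the choice of distributions are my own illustration, not from the lecture.)

```python
import random

random.seed(0)

N = 200_000  # number of simulated paths
paths = []
for _ in range(N):
    x = random.randint(0, 1)          # X_0 in {0, 1}, independent of the steps
    traj = [x]
    for _ in range(3):                # three increments xi_1, xi_2, xi_3
        x += random.choice([-1, 1])   # i.i.d. +-1 steps
        traj.append(x)                # traj[n] = X_n = X_0 + sum_{i<=n} xi_i
    paths.append(traj)

# Markov check at n = 3: the conditional probability
# P(X_3 = 2 | X_2 = 1, X_1 = x_1) should be the same for x_1 = 0 and x_1 = 2,
# since only the current state X_2 matters.
def cond_prob(x1):
    hits = [t for t in paths if t[1] == x1 and t[2] == 1]
    return sum(t[3] == 2 for t in hits) / len(hits)

p_from_0, p_from_2 = cond_prob(0), cond_prob(2)
print(p_from_0, p_from_2)  # both near 0.5
```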

Proof: $$\mathbb{P}(X_n \in B \mid X_0 = x_0, \dots, X_{n-1} = x_{n-1})$$

$$= \mathbb{P}\Big(\xi_n + x_{n-1} \in B \,\Big|\, X_0 = x_0,\ X_0 + \xi_1 = x_1,\ \dots,\ X_0 + \sum_{i=1}^{n-1}\xi_i = x_{n-1}\Big)$$

$$\overset{\text{independence}}{=} \mathbb{P}(\xi_n + x_{n-1} \in B) = \dots = \mathbb{P}(X_n \in B \mid X_{n-1} = x_{n-1}).$$

Why do we suddenly treat these conditional expectations like elementary conditional probabilities of events?
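(My guess at what this shorthand abbreviates, assuming the standard "freezing" lemma: if $\xi$ is independent of $\mathcal{G}$ and $Y$ is $\mathcal{G}$-measurable, then $\mathbb{E}[f(\xi, Y) \mid \mathcal{G}] = g(Y)$ a.s. with $g(y) := \mathbb{E}[f(\xi, y)]$. Applied here with $\mathcal{G} = \sigma(X_0, \dots, X_{n-1})$:

$$\mathbb{E}\big[\mathbb{1}_B(X_n)\,\big|\,\sigma(X_0,\dots,X_{n-1})\big] = \mathbb{E}\big[\mathbb{1}_B(\xi_n + X_{n-1})\,\big|\,\sigma(X_0,\dots,X_{n-1})\big] = g(X_{n-1}), \qquad g(x) := \mathbb{P}(\xi_n + x \in B),$$

since $\xi_n$ is independent of $\sigma(X_0,\dots,X_{n-1})$ and $X_{n-1}$ is measurable with respect to it. The same computation with $\mathcal{G} = \sigma(X_{n-1})$ yields the same $g(X_{n-1})$, so the two conditional expectations agree a.s. Is this the rigorous version of the professor's calculation?)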

Sorry for the awful formatting of the proof.