Markov Chain Definition Question


This is a very basic question about Markov chains, but I am desperately lost. We are given the following definition of a Markov chain:

$P(X_{n+1}\in A | X_1 = x_1, ..., X_n=x_n) = P(X_{n+1} \in A |X_n = x_n)$

for all measurable sets $A$.

Show that $P(X_{n+1}\in A | X_1 = x_1, ..., X_{n-k}=x_{n-k}) = P(X_{n+1}\in A | X_{n-k}=x_{n-k})$ for all $1 \le k \le n-1$ (so that the conditioning event makes sense).

I started by conditioning on $X_n$ and proceeding from there (assuming the $X_i$ have a joint density, so that conditional densities such as $f(x_n | x_1, ..., x_{n-k})$ make sense), but I've gotten nowhere:

$$P(X_{n+1}\in A | X_1 = x_1, ..., X_{n-k}=x_{n-k})$$

$$ = \int P(X_{n+1}\in A | X_n = x_n, X_1 = x_1, ..., X_{n-k}=x_{n-k})\, f(x_n | x_1, ..., x_{n-k})\,dx_n$$

At this point I'm stuck: the definition only lets me drop the history when *all* of $X_1, ..., X_{n-1}$ appear in the conditioning, whereas here the intermediate values $x_{n-k+1}, ..., x_{n-1}$ are missing.
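Not a proof, but as a numerical sanity check I convinced myself the identity holds on a small finite-state chain (the transition matrix and the choices $n=4$, $k=2$, $A=\{0\}$ below are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state transition matrix, chosen only for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def simulate(n_steps, n_paths):
    """Simulate n_paths independent chains for n_steps steps (inverse-CDF sampling)."""
    X = np.zeros((n_paths, n_steps), dtype=int)
    X[:, 0] = rng.integers(0, 3, size=n_paths)  # uniform initial distribution
    for t in range(1, n_steps):
        u = rng.random(n_paths)
        cdf = P[X[:, t - 1]].cumsum(axis=1)       # row-wise CDF of next-state law
        X[:, t] = (u[:, None] > cdf).sum(axis=1)  # inverse-CDF sample per path
    return X

# Take n = 4, k = 2, A = {0}; columns of X are X_1, ..., X_5.
X = simulate(5, 200_000)
lhs = (X[(X[:, 0] == 0) & (X[:, 1] == 0), 4] == 0).mean()  # P(X_5=0 | X_1=0, X_2=0)
rhs = (X[X[:, 1] == 0, 4] == 0).mean()                     # P(X_5=0 | X_2=0)
exact = np.linalg.matrix_power(P, 3)[0, 0]                 # 3-step probability 0 -> 0
print(lhs, rhs, exact)  # the three numbers agree up to Monte Carlo error
```

So the statement certainly looks true; what I'm missing is how to derive it from the definition.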