Equivalent Markov property for a continuous-space, discrete-time Markov chain


Suppose $Y_n = \sum_{k = 1}^{n} X_k$ is a random walk on $\mathbb{R}$, where the $X_k$ are i.i.d. random variables. I know that $Y_n$ is a discrete-time Markov chain on the continuous state space $\mathbb{R}$, so by definition, for all $A\in\mathcal{B}(\mathbb{R})$ and all $n\geq 2$:
$$ P(Y_n\in A|Y_{n-1} = y_{n-1},Y_{n-2} = y_{n-2},\dots,Y_1 = y_1) = P(Y_n\in A|Y_{n-1} = y_{n-1}), $$
and equivalently, for all $A,B_{n-2},\dots,B_1\in\mathcal{B}(\mathbb{R})$:
$$ P(Y_n\in A|Y_{n-1} = y_{n-1},Y_{n-2} \in B_{n-2},\dots,Y_1 \in B_1) = P(Y_n\in A|Y_{n-1} = y_{n-1}). $$
I would like to prove or disprove that, under the previous hypothesis,
$$ P(Y_n\in A|Y_{n-1} \in B_{n-1},Y_{n-2} \in B_{n-2},\dots,Y_1 \in B_1) = P(Y_n\in A|Y_{n-1} \in B_{n-1}). $$
I think this fact is true in the discrete case (by breaking those probabilities into discrete sums), but the continuous case is giving me some trouble. Any hint will be appreciated. Thank you for your help!

Accepted answer:

It is false even in the discrete case.

Consider for example the case where $B_{n-1}$ is the whole state space, so that the event $Y_{n-1}\in B_{n-1}$ has probability $1$ and conditioning on it makes no difference. Your property would imply $P(Y_n\in A | Y_{n-2}\in B_{n-2},\dots, Y_1\in B_1)=P(Y_n\in A)$, which of course is not true in general.
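To make this concrete, here is a sketch that checks the claim on a small random walk (my own choice of example, not from the post): fair $\pm 1$ steps, $n = 3$, with $B_2 = \mathbb{R}$, $B_1 = \{1\}$, and $A = \{3\}$. The proposed property would force $P(Y_3 \in A \mid Y_2 \in \mathbb{R}, Y_1 = 1) = P(Y_3 \in A)$, and exact enumeration shows these differ.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 8 equally likely step sequences of a fair +/-1 walk.
# (Assumed step distribution; the question leaves the X_k unspecified.)
paths = []  # (probability, [Y1, Y2, Y3])
for steps in product((-1, 1), repeat=3):
    ys = [sum(steps[:k]) for k in (1, 2, 3)]
    paths.append((Fraction(1, 8), ys))

def prob(pred):
    """Total probability of the paths satisfying pred."""
    return sum(p for p, ys in paths if pred(ys))

# Left side: P(Y3 = 3 | Y2 in R, Y1 = 1).  Conditioning on Y2 in R
# is vacuous, so this is P(Y3 = 3 | Y1 = 1).
lhs = prob(lambda ys: ys[2] == 3 and ys[0] == 1) / prob(lambda ys: ys[0] == 1)

# Right side claimed by the property: P(Y3 = 3 | Y2 in R) = P(Y3 = 3).
rhs = prob(lambda ys: ys[2] == 3)

print(lhs, rhs)  # 1/4 vs 1/8 -- the property fails
```

Since reaching $Y_3 = 3$ requires three $+1$ steps, conditioning on $Y_1 = 1$ (one step already taken) doubles the probability, so the two sides disagree.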

Or, for a different specific example, consider a deterministic process on three states with transitions $1 \to 2 \to 3 \to 1 \to \cdots$. Compare $P(X_3=3 \mid X_2\in\{2,3\}, X_1=1)$ with $P(X_3=3 \mid X_2\in\{2,3\}, X_1=2)$: the first is $1$ and the second is $0$, yet both condition on the same event $\{X_2\in\{2,3\}\}$, so no single value of $P(X_3=3 \mid X_2\in\{2,3\})$ can equal both.
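The cycle example can also be verified by exact enumeration. I assume a uniformly random initial state (the answer leaves the initial distribution unspecified); any distribution giving positive mass to states 1 and 2 works the same way.

```python
from fractions import Fraction

def step(s):
    """Deterministic transition on the cycle 1 -> 2 -> 3 -> 1."""
    return s % 3 + 1

# All three trajectories, each with probability 1/3 (assumed uniform start).
paths = [(Fraction(1, 3), x1, step(x1), step(step(x1))) for x1 in (1, 2, 3)]

def cond_prob(event, given):
    """P(event | given), computed by exact enumeration."""
    num = sum(p for p, *xs in paths if event(xs) and given(xs))
    den = sum(p for p, *xs in paths if given(xs))
    return num / den

# P(X3 = 3 | X2 in {2,3}, X1 = 1) -- forced path 1 -> 2 -> 3, so 1.
p1 = cond_prob(lambda xs: xs[2] == 3,
               lambda xs: xs[1] in (2, 3) and xs[0] == 1)

# P(X3 = 3 | X2 in {2,3}, X1 = 2) -- forced path 2 -> 3 -> 1, so 0.
p2 = cond_prob(lambda xs: xs[2] == 3,
               lambda xs: xs[1] in (2, 3) and xs[0] == 2)

print(p1, p2)  # 1 and 0: conditioning on X1 still matters
```

Since the two conditional probabilities differ, $P(X_3=3 \mid X_2\in\{2,3\})$ cannot replace both, which disproves the proposed event-wise Markov property even in this tiny discrete chain.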