Is this a Markov chain property?


Let $A, B$ be measurable sets and $(X_n)_n$ a Markov chain. Does either of the following properties hold?

$$P(X_2 \in B | X_1=x_1,X_0 \in A) = P(X_2 \in B|X_1=x_1)$$ or $$P(X_2 \in B|X_1 \in A,X_0=x_0) = P(X_2 \in B|X_1 \in A)?$$

If anything is unclear, please let me know.

On BEST ANSWER

The first identity holds. By the Markov property, $P(X_2 \in B \mid X_1 = x_1, X_0 = x_0) = P(X_2 \in B \mid X_1 = x_1)$ for every $x_0$, and the right-hand side does not depend on $x_0$; integrating over $x_0 \in A$ with respect to the conditional law of $X_0$ given $X_1 = x_1$ therefore gives $P(X_2 \in B \mid X_1 = x_1, X_0 \in A) = P(X_2 \in B \mid X_1 = x_1)$.

The second identity fails in general. Conditioning on the event $\{X_1 \in A\}$ does not pin down the value of $X_1$, and the conditional distribution of $X_1$ within $A$ still depends on $x_0$, so $X_0$ is not screened off. A small three-state chain already gives a counterexample.
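To make this concrete, here is a sketch with a hypothetical three-state chain (all the numbers, the sets $A = \{0,1\}$, $B = \{2\}$, and the names below are illustrative choices, not from the post). It computes both sides of each identity directly from the joint law $\mu(i)\,P_{ij}\,P_{jk}$: the first identity checks out, the second does not.

```python
from itertools import product

# Hypothetical 3-state chain chosen to serve as a counterexample.
states = [0, 1, 2]
mu = [1/3, 1/3, 1/3]          # initial distribution of X0
P = [[0.5, 0.5, 0.0],         # P[i][j] = P(X_{n+1} = j | X_n = i)
     [0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5]]

def joint(i, j, k):
    """P(X0=i, X1=j, X2=k) = mu(i) * P[i][j] * P[j][k]."""
    return mu[i] * P[i][j] * P[j][k]

def cond(num_event, den_event):
    """P(num_event | den_event); events are lists of triples (i, j, k)."""
    return (sum(joint(*t) for t in num_event)
            / sum(joint(*t) for t in den_event))

A, B, x1, x0 = {0, 1}, {2}, 1, 1
triples = list(product(states, repeat=3))

# Identity 1: P(X2 in B | X1 = x1, X0 in A) vs P(X2 in B | X1 = x1) -- equal.
lhs1 = cond([t for t in triples if t[2] in B and t[1] == x1 and t[0] in A],
            [t for t in triples if t[1] == x1 and t[0] in A])
rhs1 = cond([t for t in triples if t[2] in B and t[1] == x1],
            [t for t in triples if t[1] == x1])

# Identity 2: P(X2 in B | X1 in A, X0 = x0) vs P(X2 in B | X1 in A) -- differ.
lhs2 = cond([t for t in triples if t[2] in B and t[1] in A and t[0] == x0],
            [t for t in triples if t[1] in A and t[0] == x0])
rhs2 = cond([t for t in triples if t[2] in B and t[1] in A],
            [t for t in triples if t[1] in A])

print(lhs1, rhs1)  # 0.5 0.5  (identity 1 holds)
print(lhs2, rhs2)  # 0.5 0.25 (identity 2 fails for this chain)
```

The point of the counterexample is that starting from $x_0 = 1$ forces $X_1 = 1$ within $A$ (since $P_{10} = 0$), while the unconditional entry into $A$ mixes states $0$ and $1$, which have different chances of reaching $B = \{2\}$.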