Strong Markov Property for Markov Chains - Statement Verification


I suspect that my handwritten lecture notes for the Strong Markov Property are wrong. I'd appreciate corrections to them.

We first define the following:

A random variable $\tau$ taking values in $\{0, 1, 2, \ldots\} \cup \{\infty\}$ is called a stopping time if, for every $n$, the event $\{\tau \leq n\}$ depends only on $X_0, X_1, \ldots, X_n$.
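To make the definition concrete, here is a minimal sketch (names are my own, not from the notes) of the classic example of a stopping time: the first hitting time of a state. Whether $\{\tau \leq n\}$ holds is decided by the path up to time $n$ alone, which is exactly the stopping-time property.

```python
def first_hit(path, target):
    """First index n with path[n] == target, or None if never hit.

    This is a stopping time: to decide whether tau <= n, we only
    need to inspect path[0], ..., path[n].
    """
    for n, x in enumerate(path):
        if x == target:
            return n
    return None

# Example: the chain visits 2 for the first time at step 2.
print(first_hit([0, 1, 2, 1], 2))  # -> 2
```

By contrast, the *last* visit to a state is not a stopping time, since deciding it requires looking into the future of the path.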

We now state the theorem:

Suppose that we have a Markov chain $(X_n)$ with transition probability matrix $P$, initial distribution $\lambda$, and a stopping time $\tau$. Let $f = f(X_\tau, X_{\tau + 1}, X_{\tau + 2}, \ldots)$ be a bounded measurable function. Then, on $\{\tau < \infty\}$, $\mathbb{E}_\lambda[f(X_\tau, X_{\tau + 1}, X_{\tau + 2}, \ldots) \mid X_0, X_1, \ldots, X_\tau] = \mathbb{E}_\color{red}{X_\tau}[f(X_0, X_1, X_2, \ldots)]$.

The subscript marked in red is what I suspect is wrong. Should it instead be $\tau$? Are there any other mistakes in the statement of this theorem?

Best answer:

The statement is correct with $X_\tau$, but you have to read it carefully. The left-hand side is a conditional expectation, hence a random variable, so for (almost) every $\omega \in \Omega$ you have $\mathbb{E}(f(X_\tau, X_{\tau+1}, \ldots) \mid \mathcal{F}_\tau)(\omega) = \mathbb{E}_{X_{\tau}(\omega)}(f(X_0, X_1, \ldots))$, where $\mathcal{F}_\tau$ is the $\sigma$-algebra generated by $X_0, \ldots, X_\tau$. In other words, you can restart your Markov chain at the random time $\tau$ and evaluate the conditional expectation knowing only the current state $X_\tau(\omega)$.

See also, for example, Durrett, Probability: Theory and Examples, Section 6.3.