Simple or Strong Markov Property when conditioning on value of stopping time?


Suppose I have a discrete-time Markov Chain $(X_n)_{n \geq 0}$ with the hitting time $H_x:= \inf \{n \geq 0 \colon X_n = x\}$ for some $x \in E$, where $E$ is a countable state space.

Consider now

$E[1(H_x < \infty) f(X_n)]$, and suppose that we can write $f(X_n) = g(X_n) \circ \theta_m$, where $\circ\, \theta_m$ denotes composition with the shift operator by $m$ steps (so that $g(X_n) \circ \theta_m = g(X_{n+m})$). Then we have

$$E[1(H_x < \infty) f(X_n)] = \sum_{m=0}^{\infty} E[1(H_x = m) f(X_n)] = \sum_{m=0}^{\infty} E[1(H_x = m)\, g(X_n) \circ \theta_m],$$

since $\{H_x < \infty\} = \bigcup_{m \geq 0} \{H_x = m\}$ (note $H_x = 0$ is possible, as $H_x$ is defined with $n \geq 0$).

When I want to apply the Markov property here, is it the simple or the strong one? On the one hand a stopping time is involved, but on the event $\{H_x = m\}$ it takes a deterministic value. I hope my question is clear! Thank you very much.


Best answer:

Having broken things down according to the value of $H_x$, you can apply the simple Markov property to each term of the sum on the far right: the event $\{H_x = m\}$ is $\mathcal{F}_m$-measurable because $H_x$ is a stopping time, and $m$ is a deterministic time.
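Concretely, since $g(X_n) \circ \theta_m = g(X_{n+m})$ and $\{H_x = m\} \in \mathcal{F}_m$, the simple Markov property at the deterministic time $m$ gives, for each term,

$$E[1(H_x = m)\, g(X_n) \circ \theta_m] = E\big[1(H_x = m)\, E_{X_m}[g(X_n)]\big] = P(H_x = m)\, E_x[g(X_n)],$$

where the last step uses that $X_m = x$ on $\{H_x = m\}$. Summing over $m$ recovers $P(H_x < \infty)\, E_x[g(X_n)]$, the same expression the strong Markov property yields in one step.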

Alternatively, you could write $f(X_n)$ as $g(X_n)\circ\theta_{H_x}$ on $\{H_x<\infty\}$ and use the strong Markov property at the stopping time $H_x$.
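As a sanity check, the term-by-term identity can be verified numerically by exact path enumeration. The following is a minimal sketch: the transition matrix `P`, initial distribution `mu`, function `g`, and the indices `x, m, n` are all arbitrary illustrative choices, not taken from the question. It compares $E[1(H_x = m)\, g(X_n) \circ \theta_m]$ with $E[1(H_x = m)\, E_{X_m}[g(X_n)]]$.

```python
import itertools
import numpy as np

# Illustrative 3-state chain: P, mu, g and the indices x, m, n are
# arbitrary choices, not taken from the question.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
mu = np.array([0.2, 0.5, 0.3])   # initial distribution
g = np.array([1.0, -2.0, 3.0])   # a bounded function g on the state space
x, m, n = 0, 2, 3                # target state, value of H_x, time index

# h(y) = E_y[g(X_n)] = (P^n g)(y)
h = np.linalg.matrix_power(P, n) @ g

lhs = rhs = 0.0
L = m + n                        # the paths X_0, ..., X_{m+n} suffice
for path in itertools.product(range(3), repeat=L + 1):
    p = mu[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a, b]
    # H_x = first index k with X_k = x (None if the path never hits x)
    hit = min((k for k, s in enumerate(path) if s == x), default=None)
    if hit == m:                       # the event {H_x = m}
        lhs += p * g[path[m + n]]      # g(X_n) composed with theta_m is g(X_{m+n})
        rhs += p * h[path[m]]          # simple Markov property at time m

print(lhs, rhs)   # the two expectations agree up to rounding
```

Since every path probability is summed exactly, the two sides match to floating-point precision, which is exactly the content of applying the simple Markov property on each slice $\{H_x = m\}$.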