Possibly broken definition of the strong Markov property


Let

  • $I\subseteq [0,\infty)$ be closed under addition with $0\in I$
  • $E$ be a Polish space and $\mathcal E$ be the Borel $\sigma$-algebra on $E$
  • $X=(X_t)_{t\in I}$ be a Markov process with values in $(E,\mathcal E)$ and distributions $(\operatorname P_x)_{x\in E}$
  • $\mathbb F=(\mathcal F_t)_{t\in I}$ be the filtration generated by $X$

$X$ is said to have the strong Markov property $:\Leftrightarrow$ For all almost surely finite $\mathbb F$-stopping times $\tau$, $x\in E$ and bounded, $\mathcal E^{\otimes I}$-measurable $f:E^I\to\mathbb R$, $$\operatorname E_x\left[f\left(\left(X_{\tau+t}\right)_{t\in I}\right)\mid\mathcal F_\tau\right]=\operatorname E_{X_\tau}\left[f\left(X\right)\right]\;\;\;\operatorname P_x\text{-almost surely}\;,\tag 1$$ where $\mathcal F_\tau:=\left\{A\in\mathcal A:A\cap\left\{\tau\le t\right\}\in\mathcal F_t\;\text{for all }t\in I\right\}$.
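To make the definition concrete, here is a minimal Monte Carlo sketch for a simple symmetric random walk (the walk, the level $3$, the horizon, and the sample sizes are illustrative choices of mine, not part of the question). The strong Markov property says that restarting the walk at the hitting time $\tau$ of a level yields a fresh walk started at $X_\tau$, so the post-$\tau$ increment should be distributed like the same-length increment of a fresh walk:

```python
import random

def walk_step():
    """One ±1 step of a symmetric random walk."""
    return 1 if random.random() < 0.5 else -1

def post_tau_increment(level=3, horizon=5, max_steps=10_000):
    """Run a symmetric walk from 0 until it first hits `level` (the
    stopping time tau; a.s. finite, though possibly slow, hence the
    practical cap max_steps), then return X_{tau+horizon} - X_tau."""
    x, t = 0, 0
    while x != level and t < max_steps:
        x += walk_step()
        t += 1
    start = x
    for _ in range(horizon):
        x += walk_step()
    return x - start

random.seed(0)
n = 20_000
post = [post_tau_increment() for _ in range(n)]
fresh = [sum(walk_step() for _ in range(5)) for _ in range(n)]
# Both empirical means should be near 0, reflecting that the post-tau
# increments have the same law as increments of a fresh walk.
print(sum(post) / n, sum(fresh) / n)
```

This is of course a trivial instance (i.i.d. increments), but it shows the shape of the statement: conditioning on $\mathcal F_\tau$, the restarted path behaves like a path under $\operatorname P_{X_\tau}$.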

I'm curious about two things:

  1. What's the reason to force $\tau$ to be almost surely finite? And what does "almost surely" even mean in this context (with respect to which probability measure)?
  2. Unless $\tau$ is $\operatorname P_x$-almost surely finite, the integrand on the left and the expression on the right side of $(1)$ seem to be undefined on $\left\{\tau=\infty\right\}$.

So, is the given definition broken? If that's the case: What do we need to change to fix it?


There are 2 best solutions below


"... with respect to what probability measure?"

What isn't said outright is that we need to have $P_x(\tau < \infty) = 1$ for every $x \in E$. "Almost surely" means that, whatever the starting point $x$, the sample paths of $(X_t)$ for which $\tau < \infty$ form a set of $P_x$-probability one.

But you seem to be answering your own question. In your first point you ask why the stopping time must be finite; in your second you observe that the definition makes no sense when $\tau$ isn't finite. That seems like a good reason to impose the condition.
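To see that $P_x(\tau < \infty) = 1$ is a genuine restriction, here is a hedged sketch (the bias $p = 0.3$, the level $3$, and the truncation are my own illustrative choices). For a random walk with downward drift, the hitting time of $+3$ is *not* almost surely finite: the walk reaches $+3$ with probability $(p/q)^3 = (3/7)^3 \approx 0.079$, and on the remaining paths $X_\tau$ is simply undefined:

```python
import random

def hits_level(p=0.3, level=3, max_steps=2000):
    """One downward-biased walk from 0: does it ever reach `level`?
    (Truncated at max_steps; with this strong drift, a walk that hasn't
    hit by then almost certainly never will.)"""
    x = 0
    for _ in range(max_steps):
        x += 1 if random.random() < p else -1
        if x == level:
            return True
    return False

random.seed(1)
n = 20_000
frac = sum(hits_level() for _ in range(n)) / n
# Empirical P_0(tau < infinity), which should be well below 1
# (theoretical value (0.3/0.7)**3, roughly 0.079).
print(frac)
```

For such a $\tau$, the left-hand side of $(1)$ has no meaning on $\{\tau = \infty\}$, which is exactly the asker's second point.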

  1. In this context, "almost surely" means "with $P_x$ probability $1$, for each $x\in E$".

  2. Another common (and equivalent) version of the definition is to ask for the equality in (1) to hold $P_x$-a.s. on the event $\{\tau<\infty\}$ (for each $x\in E$). ($\tau$ being an arbitrary stopping time, not necessarily finite.)