Understanding the Markov property of Brownian motion


I'm trying to understand the Markov property for Brownian motion in full generality. The textbook I'm following states it as follows:

Recall that we have a family of measures $P_x, x \in \mathbb{R}$ on $(C,\mathcal{C})$ so that under $P_x, B_t(\omega) = \omega(t)$ is a Brownian motion starting at $x$. For $s \geq 0$, we define the shift transformation $\theta_s: C \to C$ by $$(\theta_s \omega)(t) = \omega(s+t) \quad \textrm{for} \quad t \geq 0$$ Theorem 8.2.1: If $s \geq 0$ and $Y$ is bounded and $\mathcal{C}$-measurable, then for all $x \in \mathbb{R}$ $$E_x(Y \circ \theta_s | \mathcal{F}_s^+) = E_{B_s}Y$$ where the right-hand side is the function $\phi(x) = E_xY$ evaluated at $x = B_s$.

I think I understand the general idea, but there are still two gaps that make me think I'm fundamentally missing something here:

1) I'm confused by the $x$ subscript on $E_x$ on the left-hand side. I understand that this is expectation with respect to $P_x$, but shouldn't $x$ then also appear somewhere on the right-hand side?

2) I'm trying to figure out which functions $Y: C[0,\infty) \to \mathbb{R}$ (as in the theorem statement above) would let me state this Markov property in terms of a simpler function $f: \mathbb{R} \to \mathbb{R}$, involving for example $f(B_{t+s} - B_s)$. I think this is something like $Y(\omega) = \omega(t)$, but I'm pretty shaky on this.


Best answer
  1. In some sense, there is an $x$ on the right-hand side because the equality $$\mathbb{E}_x(Y \circ \theta_s \mid \mathcal{F}_s^+)(\omega) = \mathbb{E}_{B_s(\omega)}(Y)$$ holds only $\mathbb{P}_x$-almost surely; that is, the null set of paths $\omega$ for which the equality can fail depends on $x$.
  2. For a function $f: \mathbb{R} \to \mathbb{R}$ (bounded and measurable), take $Y(\omega) = f(\omega(t))$, so that $(Y \circ \theta_s)(\omega) = f(\omega(s+t))$. The Markov property then reads $$\mathbb{E}_x (f(B_{t+s}) \mid \mathcal{F}_s^+) = \mathbb{E}_{B_s} f(B_{t}), \qquad t \geq 0.$$ This implies in particular $$\mathbb{E}_x (f(B_{t+s}) \mid \mathcal{F}_s^+) = \mathbb{E}_{x}(f(B_{t+s}) \mid B_s);$$ this means that the future does not depend on the past, but only on the current position of the process.
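
Since question 2 mentions $f(B_{t+s} - B_s)$ specifically, here is a sketch of how the increment version falls out of the theorem, taking $f$ bounded and measurable:

```latex
% Choice of functional on path space: Y(\omega) = f(\omega(t) - \omega(0)).
% Shifting the path: (Y \circ \theta_s)(\omega) = f(\omega(s+t) - \omega(s)),
% i.e. Y \circ \theta_s = f(B_{t+s} - B_s).
% The function \phi(y) = \mathbb{E}_y Y = \mathbb{E}_y f(B_t - B_0) = \mathbb{E}_0 f(B_t)
% does not depend on y, because the increment B_t - B_0 has the same N(0,t)
% law under every \mathbb{P}_y. Theorem 8.2.1 therefore gives
\mathbb{E}_x\bigl(f(B_{t+s} - B_s) \mid \mathcal{F}_s^+\bigr)
  = \phi(B_s) = \mathbb{E}_0 f(B_t)
  \quad \mathbb{P}_x\text{-a.s.}
% The right-hand side is a constant, which says the increment B_{t+s} - B_s
% is independent of \mathcal{F}_s^+, recovering independence of increments.
```

The same pattern with $Y(\omega) = \omega(t)$, for which $\phi(y) = \mathbb{E}_y B_t = y$, gives $\mathbb{E}_x(B_{t+s} \mid \mathcal{F}_s^+) = B_s$, the martingale property of Brownian motion.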