Prove that $(\tau_x, x \geq 0)$ is a Markov process


Let $(W_t, t \geq 0)$ be a Wiener process, and let $\tau_x = \min\left\{t \geq 0 : W_t = x\right\}$ for $x \geq 0$. Prove that the process $\tau = (\tau_x, x \geq 0)$ is Markov. Could anybody give a hint on how to approach this problem? Thank you!

My attempt: to prove that $\tau = (\tau_x, x \geq 0)$ is a Markov process, I need to show that $$ \mathbb{P}(\tau_x \leq t \mid \tau_{y_1}, \tau_{y_2}, \ldots, \tau_{y_n}, \tau_{y}) = \mathbb{P}(\tau_x \leq t \mid \tau_{y}) $$ for all $y_1 < y_2 < \ldots < y_n < y < x$ and all $t \geq 0$. Since $W$ has continuous paths, $\{\tau_x \leq t\} = \{\max_{s \leq t} W_s \geq x\}$, so I can rewrite the left-hand side: $$ \mathbb{P}(\tau_x \leq t \mid \tau_{y_1}, \tau_{y_2}, \ldots, \tau_{y_n}, \tau_{y}) = \mathbb{P}\Big(\max_{s \leq t} W_s \geq x \,\Big|\, \tau_{y_1}, \tau_{y_2}, \ldots, \tau_{y_n}, \tau_{y}\Big). $$ Then I tried to prove that the event $\{\max_{s \leq t} W_s \geq x\}$ is independent of the $\sigma$-algebra $\sigma\left\{\tau_{y_1}, \tau_{y_2}, \ldots, \tau_{y_n}, \tau_{y}\right\}$. At this point I am stuck, because I cannot prove it.
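As a sanity check on the identity $\{\tau_x \leq t\} = \{\max_{s \leq t} W_s \geq x\}$ and the reflection principle $\mathbb{P}(\tau_x \leq t) = 2\,\mathbb{P}(W_t \geq x)$, here is a small Monte Carlo sketch (Euler discretization; all names and parameters are illustrative choices, not part of the problem):

```python
import numpy as np

# Monte Carlo sketch: simulate Brownian paths on [0, T] by an Euler scheme
# and compare P(tau_x <= T), i.e. the probability that the running maximum
# reaches x, with 2 * P(W_T >= x) from the reflection principle.
rng = np.random.default_rng(0)
n_paths, n_steps, T, x = 5_000, 1_000, 1.0, 0.5
dt = T / n_steps

steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(steps, axis=1)             # path values at dt, 2dt, ..., T
M = np.maximum.accumulate(W, axis=1)     # running maximum of each path

p_hit = (M[:, -1] >= x).mean()           # estimates P(tau_x <= T)
p_end = (W[:, -1] >= x).mean()           # estimates P(W_T >= x)
print(p_hit, 2 * p_end)                  # close, up to discretization bias
```

The discrete-time maximum slightly undershoots the true running maximum, so `p_hit` is biased a little low; the two estimates agree up to that bias and Monte Carlo noise.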

Best answer:

It is proven in Mörters and Peres, *Brownian Motion* (Theorem 2.35), that $(\tau_x, x \geq 0)$ is a process with stationary, independent increments.

Now suppose $X = (X_{t})_{t \geq 0}$ is a stochastic process with independent increments with respect to its natural filtration, i.e. $X_{t+s}-X_{s}$ is independent of $\mathcal{F}_{s}=\sigma(X_{r} : r\leq s)$ for all $s, t \geq 0$. We claim that $X$ is then a Markov process.
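This claim can be illustrated numerically (an illustration, not a proof): for a Gaussian random walk, which has independent increments, conditioning on extra past information beyond the current value should not move the conditional expectation of a bounded function of the future. All names and parameters below are arbitrary choices for the demo.

```python
import numpy as np

# Gaussian random walk X_1, X_2, X_3 with independent increments, and a
# bounded function u = tanh.  We estimate E[u(X_3) | X_2 near 0] twice:
# once restricted to paths with X_1 > 0 and once with X_1 < 0.  If the
# Markov claim holds, the extra past information (the sign of X_1) is
# irrelevant and the two estimates nearly coincide.
rng = np.random.default_rng(1)
n = 200_000
Z = rng.normal(size=(n, 3))
X = np.cumsum(Z, axis=1)          # columns: X_1, X_2, X_3
u = np.tanh

in_bin = np.abs(X[:, 1]) < 0.1    # condition on the current value X_2
past_pos = in_bin & (X[:, 0] > 0) # ... plus extra past: sign of X_1
past_neg = in_bin & (X[:, 0] < 0)

m_pos = u(X[past_pos, 2]).mean()
m_neg = u(X[past_neg, 2]).mean()
print(m_pos, m_neg)               # nearly equal: the extra past is irrelevant
```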

Let $u$ be bounded and measurable.

Define $\Phi:\mathbb{R}\times \mathbb{R}\to \Bbb{R}$ by $\Phi(x,y)=u(x+y)$. (More generally one can take measurable maps $X:\Omega\to C$ and $Y:\Omega\to D$ for measurable spaces $C$ and $D$; I don't want to get into more notation.)

Then it can be shown (see René Schilling's *Brownian Motion*, Lemma $A.3$) that if $X$ is $\mathcal{X}$-measurable and $Y$ is $\mathcal{Y}$-measurable, with $\mathcal{X}$ and $\mathcal{Y}$ independent $\sigma$-algebras, then $$E(\Phi(X,Y)\mid\mathcal{X})=E(\Phi(x,Y))\big\vert_{x=X}=E(\Phi(X,Y)\mid X)$$ for every bounded measurable $\Phi:C\times D\to \Bbb{R}$.

Now apply this with $X=X_{s}$, $\mathcal{X}=\sigma(X_{r} : r\leq s)$, $Y=X_{t+s}-X_{s}$, and $\Phi$ as defined above to conclude directly that $(X_{t})$ is a Markov process, i.e. $E(u(X_{t+s})\mid\mathcal{F}_{s})=E(u(X_{t+s})\mid X_{s})$.
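Spelled out, since $X_{s}$ is $\mathcal{F}_{s}$-measurable and $X_{t+s}-X_{s}$ is independent of $\mathcal{F}_{s}$, the chain of equalities reads $$ E\big(u(X_{t+s})\mid \mathcal{F}_{s}\big) = E\big(\Phi(X_{s},\, X_{t+s}-X_{s})\mid \mathcal{F}_{s}\big) = g(X_{s}) = E\big(u(X_{t+s})\mid X_{s}\big), \qquad g(x) := E\, u\big(x + (X_{t+s}-X_{s})\big). $$ Applying this with the hitting-time process, $X_{x}=\tau_{x}$ (index $x$ in place of $t$), gives the Markov property of $(\tau_x, x \geq 0)$.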