Brownian Motion hitting times in one dimension

Let $(X_t)_{t\geq0}$ be a Brownian Motion starting from $0$. Define for $a\in\mathbb{R}$, $$T_a = \inf\{t\geq0:X_t=a\}.$$ For $a,b>0$, show that $\mathbb{P}(T_{-a}<T_b)=\frac{b}{a+b}$. I really don't have any idea how to show this formally (intuitively it makes sense), so any help would be greatly appreciated!


The solution is to let $T=T_{-a}\wedge T_b$ and consider the stopped process $X^T$, defined by $X^T_t = X_{t\wedge T}$. Since Brownian motion is recurrent, $T<\infty$ almost surely. The stopped process $X^T$ is bounded between $-a$ and $b$, hence uniformly integrable, so the optional stopping theorem applies even though $T$ is not a bounded stopping time: $$\mathbb{E}(X_T)=\mathbb{E}(X_0)=0.$$

On the other hand, since $X_T\in\{-a,b\}$ and $\mathbb{P}(T_{-a}=T_b)=0$, $$\mathbb{E}(X_T)=-a\,\mathbb{P}(T_{-a}<T_b)+b\,\mathbb{P}(T_{-a}>T_b).$$ Writing $p=\mathbb{P}(T_{-a}<T_b)$ and using $\mathbb{P}(T_{-a}>T_b)=1-p$, this becomes $-ap+b(1-p)=0$, so $p=\frac{b}{a+b}$.
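As a sanity check, a quick Monte Carlo sketch: for integer $a,b$, a symmetric simple random walk started at $0$ satisfies the same hitting identity (the gambler's ruin probability), so simulating it should give an estimate close to $\frac{b}{a+b}$. The function name `hit_prob` and the parameter choices below are illustrative, not from the original post.

```python
import random

def hit_prob(a, b, n_paths=20000, seed=0):
    """Estimate P(T_{-a} < T_b) for a symmetric simple random walk
    started at 0 -- a discrete analogue of Brownian motion.
    (Illustrative helper; names and defaults are arbitrary.)"""
    rng = random.Random(seed)
    hits_minus_a = 0
    for _ in range(n_paths):
        x = 0
        # Run the walk until it first reaches -a or b.
        while -a < x < b:
            x += 1 if rng.random() < 0.5 else -1
        if x == -a:
            hits_minus_a += 1
    return hits_minus_a / n_paths

# With a = 2, b = 3 the theoretical value is b/(a+b) = 0.6,
# and the estimate should land within Monte Carlo error of it.
print(hit_prob(2, 3))
```

The estimate fluctuates on the order of $\sqrt{p(1-p)/n}\approx 0.003$ for $20{,}000$ paths, so it should agree with $0.6$ to about two decimal places.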