Using the Strong Markov Property to deduce equality in distribution


Let $a, b > 0$, let $(B_t)_{t \geq 0}$ be a standard Brownian motion, and let $H_a, H_b$ be the first hitting times of $a$ and $b$ respectively, e.g. $H_a := \inf \{ t \geq 0 : B_t = a \}$.

Let $S_a, S_b$ be independent random variables on the same probability space distributed as $H_a, H_b$ respectively. How can one prove that $S_a + S_b$ has the same distribution as $H_{a+b}$ making use of the strong Markov property of $B_t$?

My attempt: Define $B_s' := B_{S_a + s} - B_{S_a}$. By the strong Markov property, this is again a Brownian motion, independent of $\mathcal{F}_{S_a}$. Then \begin{align*} B'_{S_b} &= b = B_{S_a + S_b} - B_{S_a} \\ &\implies B_{S_a + S_b} = b+B_{S_a} = b+a\\ & \quad \quad \quad \text{ by continuity of Brownian sample paths}. \end{align*}

... but I struggle to conclude the equality in distribution from here.


BEST ANSWER

Because of the strong Markov property, the process

$$W_t := B_{t+H_a}-B_{H_a} = B_{t+H_a}-a, \qquad t \geq 0,$$

is a Brownian motion which is independent of $\mathcal{F}_{H_a}$. If we define

$$H_b^{(W)} := \inf\{t \geq 0; W_t=b\},$$

then $H_b^{(W)}=H_b$ in distribution and $H_b^{(W)}$ is independent of $H_a$. Moreover,

\begin{align*} H_{a+b} &= \inf\{t \geq 0; B_t = a+b\} \\ &= \inf\{t \geq H_a; B_t = B_{H_a}+b\} \\ &\stackrel{t=H_a+u}{=} H_a + \inf\{u \geq 0; B_{u+H_a}-B_{H_a}=b\} \\ &= H_a + H_b^{(W)}. \tag{1} \end{align*}

(The second equality uses the continuity of the sample paths: $B$ must hit $a$ before it can hit $a+b$, so the first hitting time of $a+b$ is not smaller than $H_a$, and $B_{H_a}=a$.)
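Identity $(1)$ is a pathwise statement, and it can be illustrated on simulated paths (a sketch; the levels and discretization below are illustrative choices, not from the answer). On a discrete grid the path overshoots the level $a$ slightly at the restart, so the restarted clock can only be a little slower than the direct one; the inequality $H_{a+b} \leq H_a + H_b^{(W)}$ holds exactly even on the grid.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative parameters (not from the answer above)
a, b = 0.5, 0.5
dt, T, n_paths = 0.01, 60.0, 2000
n_steps = int(T / dt)

increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(increments, axis=1)

# Keep paths that reach a+b within the horizon (they reach a first)
X = X[X.max(axis=1) >= a + b]

n_a = np.argmax(X >= a, axis=1)        # discrete analogue of H_a
n_ab = np.argmax(X >= a + b, axis=1)   # discrete analogue of H_{a+b}

# Restarted clock: wait until W_k = X[n_a + k] - X[n_a] reaches b
diffs = [row[m:] - row[m] for row, m in zip(X, n_a)]
reached = np.array([np.any(d >= b) for d in diffs])
n_w = np.array([np.argmax(d >= b) for d in diffs])
rhs = n_a + n_w

# Where the restarted increment reaches b within the horizon, the identity
# holds as an inequality on the grid (exact equality in continuous time)
ok = reached
print("fraction with both clocks defined:", ok.mean())
print("H_{a+b} <= H_a + H_b^(W) on the grid:", np.all(n_ab[ok] <= rhs[ok]))
print("mean gap in time units:", ((rhs[ok] - n_ab[ok]) * dt).mean())
```

The small gap between the two clocks comes entirely from the discretization overshoot at level $a$; it vanishes as $dt \to 0$.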

Now take any two independent random variables $S_a$ and $S_b$ (defined on the same probability space) such that $S_a = H_a$ in distribution and $S_b=H_b$ in distribution. Then the vector $(S_a,S_b)$ has the same distribution as $(H_a, H_b^{(W)})$; in particular, $f(S_a,S_b)=f(H_a,H_b^{(W)})$ in distribution for any measurable function $f$. If we choose $f(x,y) := x+y$, it follows from $(1)$ that

$$S_a+S_b = f(S_a,S_b) = f(H_a,H_b^{(W)}) = H_a+H_b^{(W)} = H_{a+b} \quad \text{in distribution}.$$
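As a sanity check (not part of the original argument), the conclusion can be verified by Monte Carlo: simulate discretized Brownian paths, collect hitting times of $a$, $b$ and $a+b$ from three independent batches, and compare the empirical distribution of the independent sum with that of $H_{a+b}$. All parameters below (levels, step size, horizon) are arbitrary illustrative choices; paths that fail to hit within the horizon are recorded as $+\infty$, which keeps the empirical CDFs exact for $t$ below the horizon.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the answer above)
a, b = 0.5, 0.5
dt, T, n_paths = 0.01, 60.0, 2000
n_steps = int(T / dt)

def hitting_times(level, n):
    """First passage times of n discretized Brownian paths over `level`;
    paths that do not reach it before T are recorded as +inf."""
    increments = rng.normal(0.0, np.sqrt(dt), size=(n, n_steps))
    paths = np.cumsum(increments, axis=1)
    hit = paths.max(axis=1) >= level
    idx = np.argmax(paths >= level, axis=1)   # index of first crossing
    return np.where(hit, (idx + 1) * dt, np.inf)

# Three independent batches give independent samples, as in the statement
S_a = hitting_times(a, n_paths)
S_b = hitting_times(b, n_paths)
H_ab = hitting_times(a + b, n_paths)

# Compare empirical CDFs of S_a + S_b and H_{a+b} on a grid below T;
# censored (+inf) samples are correctly counted as "not yet hit" there
grid = np.linspace(0.5, 50.0, 100)
ecdf_sum = np.array([(S_a + S_b <= t).mean() for t in grid])
ecdf_dir = np.array([(H_ab <= t).mean() for t in grid])
gap = np.abs(ecdf_sum - ecdf_dir).max()
print(f"max ECDF discrepancy: {gap:.3f}")
```

The two empirical CDFs agree up to sampling noise and the upward bias that discretization introduces in hitting times (the path can cross a level between grid points).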


Remark on your attempt: $S_b = H_b$ in distribution does not imply that $B'_{S_b} = b$. Consider, for instance, $S_b := H_{-b}^{(B')}$, the first hitting time of $-b$ by $B'$: then $S_b = H_b$ in distribution (by the symmetry of Brownian motion) and $S_b$ is independent of $H_a$, but $B'_{S_b} = -b$. Consequently, your approach doesn't work.
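The counterexample in the remark can also be seen numerically (a sketch with arbitrary parameters): take $S_b$ to be the first time a simulated path hits $-b$. Its empirical distribution matches the CDF $P[H_b \leq t] = 2(1-\Phi(b/\sqrt{t}))$ up to discretization error, by symmetry, yet the path value at time $S_b$ is always (approximately) $-b$, never $b$.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

# Illustrative parameters (not from the answer above)
b = 0.5
dt, T, n_paths = 0.01, 60.0, 2000
n_steps = int(T / dt)

increments = rng.normal(0.0, sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

# S_b := first hitting time of -b; censored paths recorded as +inf
hit = paths.min(axis=1) <= -b
idx = np.argmax(paths <= -b, axis=1)
S_b = np.where(hit, (idx + 1) * dt, np.inf)
values_at_hit = paths[hit, idx[hit]]   # path value at the hitting time

def cdf_H(level, t):
    """P[H_level <= t] = 2 (1 - Phi(level / sqrt(t))) for Brownian motion."""
    phi = 0.5 * (1.0 + erf(level / sqrt(t) / sqrt(2.0)))
    return 2.0 * (1.0 - phi)

# By symmetry, S_b has the same distribution as H_b ...
grid = np.linspace(0.5, 20.0, 50)
ecdf = np.array([(S_b <= t).mean() for t in grid])
exact = np.array([cdf_H(b, t) for t in grid])
print("max |ECDF - CDF|:", np.abs(ecdf - exact).max())

# ... but the path sits at (just below) -b at time S_b, never at +b
print("largest value at the hitting time:", values_at_hit.max())
```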


Here is a less direct and less insightful approach. Perhaps the most famous consequence of the strong Markov property is the reflection principle, in the form
\begin{equation} P[S_t \geq a, \, B_t \leq b] = P[B_t \geq 2a - b], \qquad b \leq a, \end{equation}
where we define $S_t = \sup_{u \leq t} B_u$. A consequence of this is that $S_t$ and $|B_t|$ have the same distribution, seen by taking $a = b$ in the above. Combining this with the scaling property $B_t \stackrel{d}{=} \sqrt{t}\, B_1$:
\begin{equation} P[H_a \leq t] = P[S_t \geq a] = P[B_{t}^{2} \geq a^2] = P[t B_{1}^{2} \geq a^2] = P\left[\frac{a^2}{B_{1}^{2}} \leq t\right]. \end{equation}
Therefore $H_a$ and $\frac{a^2}{B_{1}^{2}}$ must have the same distribution, and differentiating in $t$ shows that the pdf is (for $s>0$ of course)
\begin{equation} f_{a}(s) = \dfrac{a}{\sqrt{2 \pi s^3}} \exp \bigg[ -\dfrac{a^2}{2s} \bigg]. \end{equation}
As we now know the densities of $S_a$ and $S_b$, we may compute the density of their sum as the convolution $f_a * f_b$ and verify that it equals $f_{a+b}$.
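The convolution step can also be checked numerically (a sketch with illustrative parameters): the representation $H_a \stackrel{d}{=} a^2/B_1^2$ gives an exact sampler, and the empirical distribution of the independent sum can be compared against the exact CDF $P[H_{a+b} \leq t] = 2\big(1 - \Phi\big((a+b)/\sqrt{t}\big)\big)$ obtained above.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(42)

a, b = 1.0, 2.0            # illustrative levels (not from the answer)
n = 200_000

# Exact samples: H_a  =d  a^2 / Z^2  with Z standard normal
S_a = a**2 / rng.standard_normal(n) ** 2
S_b = b**2 / rng.standard_normal(n) ** 2   # independent of S_a
total = S_a + S_b

def cdf_H(level, t):
    """P[H_level <= t] = 2 (1 - Phi(level / sqrt(t)))."""
    phi = 0.5 * (1.0 + erf(level / sqrt(t) / sqrt(2.0)))
    return 2.0 * (1.0 - phi)

# Empirical CDF of the sum vs. exact CDF of H_{a+b}
grid = np.linspace(0.5, 200.0, 200)
ecdf = np.array([(total <= t).mean() for t in grid])
exact = np.array([cdf_H(a + b, t) for t in grid])
err = np.abs(ecdf - exact).max()
print(f"max |ECDF - CDF| = {err:.4f}")
```

Since this sampler is exact (no path discretization), the discrepancy is pure sampling noise of order $1/\sqrt{n}$.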