I've been working on a problem and have made some progress. I believe my hurdle is my weakness in Girsanov's Theorem, so I figured that I ought to ask for pointers and to see if I'm on the right track.
Let $B_t$ be a standard Brownian motion starting at $B_0 = x > 0$. Let $T_a = \inf\{t~:~B_t = a\}$ be the first hitting time of level $a$. Find
\begin{equation*} \mathbb E^x\left[\exp\left({-\int^{T_a}_0 \frac{1}{B_s^2} ds}\right)\right] \equiv \mathbb E \left[ \exp\left({-\int^{T_a}_0 \frac{1}{B_s^2} ds}\right) ~\middle|~ B_0 = x \right] \end{equation*}
for $0 < a < x$. It seems as if a good bet would be to first find some martingale $M_t$ related to our exponential and use optional sampling with our stopping time $T_a$ to find \begin{equation*} \mathbb E[M_{T_a}] = \mathbb E[M_0] \end{equation*}
(of course prior to this we should confirm that the conditions of the optional sampling theorem are met). However, it's not obvious to me how to go about finding this martingale. Some martingales I've played around with include:
- $e^{B_t - \frac{t}{2}}$
- $e^{\lambda B_t - \frac{\lambda^2}{2} t}$
- $\exp \left({\int^t_0 \theta(s) dB_s - \frac{1}{2} \int^t_0 \theta^2(s) ds}\right)$
I can't seem to fiddle around with the martingales and the desired expectations in such a way that ends up working. My hunch is that my weakness stems from a poor understanding of how to actually use Girsanov's Theorem. That is, it seems more likely that multiplying our exponential by some factor $R_t$ would generate a martingale than subtracting some drift term, although I'm unclear on how to do so.
Thank you!
-----Edit 1-----
I've managed to whittle away at a solution (I think) using Feynman-Kac, although I'm unsatisfied with this method since the problem asked to "use martingale methods".
In any case, to sum up my methodology, we start by restating the Feynman-Kac Theorem: if $Z_t$ is a diffusion process with dynamics \begin{equation*} dZ_t = a(Z_t)dt + b(Z_t) dB_t \end{equation*}
then \begin{equation*} V(t,x) = \mathbb E^x \left[ f(Z_t) \exp \left( \int^t_0 \phi(Z_s) ds \right) \right] \end{equation*} is the function
which solves the PDE \begin{equation*} V_t(t,x) = \frac{1}{2}b^2(x) V_{xx}(t,x) + a(x) V_x(t,x) + \phi(x) V(t,x) \end{equation*}
with initial condition \begin{equation*} V(0,x) = f(x) \end{equation*}
So, in our case we're blessed with relatively simple terms \begin{align*} a(x) &= 0 \\ b(x) &= 1 \\ \phi(x) &= -\frac{1}{x^2} \\ f(x) &= 1 \end{align*}
Therefore our PDE to solve takes the form \begin{equation*} V_t = \frac{1}{2} V_{xx} - \frac{1}{x^2} V \end{equation*}
We may use separation of variables to solve this PDE. Letting $V$ be a simple product of univariate functions of its arguments $V(t,x) = X(x)T(t)$ and $\lambda$ our constant of separation, we find \begin{equation*} \frac{T'(t)}{\frac{1}{2}T(t)} = \lambda \implies T(t) = k_0 e^{\frac{1}{2}\lambda t} \end{equation*}
and \begin{equation*} x^2 X''(x) - \left(2 + \lambda x^2\right)X(x) = 0 \end{equation*}
For general $\lambda$ this equation is not solved by simple powers of $x$, but for $\lambda = 0$ it reduces to the Euler equation $x^2 X''(x) - 2X(x) = 0$, whose solutions are $X(x) = x^r$ with $r(r-1) - 2 = 0$, i.e. $r = \frac{1}{2} \pm \sqrt{\frac{1}{4}+2}$. Taking $\lambda = 0$ (which appears to be forced on us if we want any hope of matching the initial condition $V(0,x) = f(x) = 1$) also rids us of the exponential factor $T(t)$, yielding \begin{align*} \mathbb E^x\left[\exp\left({-\int^{t}_0 \frac{1}{B_s^2} ds}\right)\right] = V(t, x) &= c_1 x^{\frac{1}{2} + \sqrt{\frac{1}{4}+2}} + c_2 x^{\frac{1}{2} - \sqrt{\frac{1}{4}+2}} \\ &= c_1 x^{\frac{1}{2} + \frac{3}{2}} + c_2 x^{\frac{1}{2} - \frac{3}{2}} \\ &= c_1 x^{2} + c_2 x^{-1} \end{align*}
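Just to double-check the algebra, here is a quick symbolic sanity check (my own script, assuming SymPy is available) that $x^2$ and $x^{-1}$ both solve the stationary equation $\frac{1}{2}V_{xx} - \frac{1}{x^2}V = 0$:

```python
import sympy as sp

# Verify that x^2 and x^(-1) both solve (1/2) V'' - V / x^2 = 0,
# the lambda = 0 (time-independent) form of the Feynman-Kac PDE above.
x = sp.symbols('x', positive=True)

for V in (x**2, 1 / x):
    residual = sp.simplify(sp.Rational(1, 2) * sp.diff(V, x, 2) - V / x**2)
    assert residual == 0  # both candidates satisfy the stationary PDE
```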
Finally, we return to our problem with stopping time $T_a$. Evaluating our solution at $t = T_a$ gives the general expression for $V(T_a, x)$ (Question: are we free to substitute $T_a$ into our solution? Does the fact that $T_a$ is a random variable impose some restrictions on the Feynman-Kac method?) \begin{equation*} \mathbb E^x\left[\exp\left({-\int^{T_a}_0 \frac{1}{B_s^2} ds}\right)\right] = V(T_a, x) = c_1 x^{2} + c_2 x^{-1} \end{equation*}
which isn't particularly exciting, since this is identical to $V(t, x)$. However, stopping at $T_a$ has given us boundary conditions \begin{align*} V(T_a,a) &= 1 \\ \lim_{x\to\infty} V(T_a,x) &= 0 \end{align*}
which gives us $c_1 = 0$ and $c_2 = a$, yielding \begin{equation*} \mathbb E^x\left[ \exp \left(-\int^{T_a}_0 \frac{1}{B_s^2} ds \right)\right] = V(T_a,x) = \frac{a}{x} \end{equation*}
Interestingly (and I hadn't noticed this until now), our solution is independent of $t$. Does this mean it already gives us the answer at the stopping time $T_a$? If not, I'm not quite sure where to go from here. The reason I found $V(t, x) = \frac{a}{x}$ (which is obviously independent of $t$) is that I used the boundary conditions specific to the stopping time $T_a$, not boundary conditions for a general $t$.
-----Edit 2-----
As per the tips from Shalop, I believe that I have solved the problem within the scope of "using martingale methods".
First, consider $t$ such that $0 \leq t \leq T_a$ and let $Z_t$ be our exponential \begin{equation*} Z_t = \exp \left( -\int^{t}_0 \frac{1}{B_s^2} ds \right) \end{equation*}
We may note that the expectation $\mathbb E^x \left[ Z_t \right]$ is reasonably similar to the exponential martingale \begin{equation*} \Theta_t = \exp\left(-\int^t_0 \theta(s) dB_s - \frac{1}{2} \int^t_0 \theta^2(s) ds\right) \end{equation*}
with $\theta(s) = \frac{1}{B_s}$. So, our motivation is to somehow connect this martingale with $Z_t$ for use in the optional sampling theorem with stopping time $T_a$. We apply Ito's lemma to $\log B_t$ (valid for $t \leq T_a$, since $B_t \geq a > 0$ up to the hitting time): \begin{align*} \log B_t &= \log B_0 + \int^t_0 \frac{1}{B_s} dB_s - \frac{1}{2} \int^t_0 \frac{1}{B_s^2} ds \\ \implies -\int^t_0 \frac{1}{B_s} dB_s &= \log \frac{B_0}{B_t} - \frac{1}{2} \int^t_0 \frac{1}{B_s^2} ds \end{align*}
Serendipitously, if we subtract $\frac{1}{2}\int^t_0 \frac{1}{B_s^2} ds$ from both sides and then exponentiate, we not only produce the exponential martingale $\Theta_t$, but the right-hand side also contains a term identical to $Z_t$: \begin{align*} -\int^t_0 \frac{1}{B_s} dB_s &= \log \frac{B_0}{B_t} - \frac{1}{2} \int^t_0 \frac{1}{B_s^2} ds \\ \implies -\int^t_0 \frac{1}{B_s} dB_s - \frac{1}{2} \int^t_0 \frac{1}{B_s^2} ds &= \log \frac{B_0}{B_t} -\int^t_0 \frac{1}{B_s^2} ds \\ \implies \exp \left( -\int^t_0 \frac{1}{B_s} dB_s - \frac{1}{2} \int^t_0 \frac{1}{B_s^2} ds \right) &= \exp \left( \log \frac{B_0}{B_t} -\int^t_0 \frac{1}{B_s^2} ds \right) \\ \iff \Theta_t &= \frac{B_0}{B_t} Z_t = \frac{x}{B_t} Z_t \end{align*}
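To convince myself the Ito computation above is right, here is a rough numerical check on one simulated path (my own script; the starting point $x = 5$, horizon $t = 1$, step size, and seed are arbitrary illustrative choices, and the left-point Riemann sums only approximate the Ito integral):

```python
import numpy as np

# Numerically check the Ito identity
#   log(B_t) - log(B_0) = int_0^t (1/B_s) dB_s - (1/2) int_0^t (1/B_s^2) ds
# on one simulated Brownian path.  Starting at x = 5 keeps the path away
# from 0 with overwhelming probability.
rng = np.random.default_rng(0)
x, t, n = 5.0, 1.0, 200_000
dt = t / n

dB = rng.normal(0.0, np.sqrt(dt), n)
B = x + np.concatenate(([0.0], np.cumsum(dB)))
assert B.min() > 0  # Ito's formula for log(B) needs a strictly positive path

ito_integral = np.sum(dB / B[:-1])        # left-point sums for int (1/B_s) dB_s
time_integral = np.sum(dt / B[:-1] ** 2)  # int (1/B_s^2) ds
lhs = np.log(B[-1] / x)
rhs = ito_integral - 0.5 * time_integral
print(abs(lhs - rhs))  # small discretization error
```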
Using the martingale property of $\Theta_t$ (note that for $t \leq T_a$ the integrand $\theta(s) = \frac{1}{B_s} \leq \frac{1}{a}$ is bounded, so Novikov's condition holds and the stopped process $\Theta_{t \wedge T_a}$ is a true martingale rather than merely a local one), we take the expectation (conditional on $B_0 = x$, although this doesn't affect the value) \begin{equation*} \mathbb E^x\left[\Theta_{t \wedge T_a}\right] = \mathbb E^x\left[\Theta_0\right] = \Theta_0 = 1 \end{equation*}
Now, if we wish to consider expectation stopped at $T_a = \inf\{t ~:~ B_t = a\}$ we should verify the conditions of the optional sampling theorem on $\Theta_{T_a}$. In particular, for 1-dimensional Brownian motion it is indeed true that \begin{equation*} \mathbb P(T_a < \infty) = 1 \end{equation*}
Additionally \begin{equation*} \mathbb E\left[ \left| \Theta_{T_a} \right| \right] < \infty \end{equation*}
Indeed, since the process starts at level $x > a$ and is stopped at level $a$, we have $B_{t \wedge T_a} \geq a$; combining this with $Z_t \leq 1$ and the identity $\Theta_t = \frac{x}{B_t} Z_t$ gives the uniform bound \begin{equation*} \Theta_{t \wedge T_a} = \frac{x}{B_{t \wedge T_a}} Z_{t \wedge T_a} \leq \frac{x}{a} \end{equation*} so in particular $\mathbb E^x\left[\left|\Theta_{T_a}\right|\right] \leq \frac{x}{a} < \infty$. Lastly, \begin{equation*} 0 \leq \lim_{t\to\infty} \mathbb E^x\left[ \mathbb I_{\{T_a>t\}} \Theta_t \right] \leq \frac{x}{a} \lim_{t\to\infty} \mathbb P(T_a > t) = 0 \end{equation*}
and so optional sampling applies. Thus \begin{equation*} \mathbb E^x\left[ \Theta_{T_a} \right] = \mathbb E^x \left[ \Theta_0 \right] = \Theta_0 = 1 \end{equation*}
Hence \begin{align*} 1 = \mathbb E^x\left[ \Theta_{T_a} \right] &= \mathbb E^x\left[ \frac{x}{B_{T_a}} Z_{T_a} \right] \\ &= \mathbb E^x \left[ \frac{x}{a} Z_{T_a}\right] \\ \implies \mathbb E^x \left[ Z_{T_a} \right] &= \frac{a}{x} \end{align*}
as desired.
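As a final sanity check on $\mathbb E^x[Z_{T_a}] = \frac{a}{x}$, here is a crude Monte Carlo simulation (entirely my own construction; $x = 1.2$, $a = 1$, the step size, horizon, and path count are arbitrary, and discrete barrier monitoring biases the estimate slightly low):

```python
import numpy as np

# Crude Monte Carlo estimate of E^x[ exp(-int_0^{T_a} B_s^{-2} ds) ].
# Paths are frozen once they cross level a; paths still alive at the
# horizon contribute their running Z, a small truncation error.
rng = np.random.default_rng(42)
x, a = 1.2, 1.0
dt, n_steps, n_paths = 1e-3, 20_000, 4_000

B = np.full(n_paths, x)
log_Z = np.zeros(n_paths)             # running value of -int_0^t B_s^{-2} ds
alive = np.ones(n_paths, dtype=bool)  # paths that have not yet hit level a

for _ in range(n_steps):
    log_Z -= np.where(alive, dt / B**2, 0.0)   # left-point rule for the integral
    B += np.where(alive, rng.normal(0.0, np.sqrt(dt), n_paths), 0.0)
    alive &= B > a                             # freeze at the first crossing of a

estimate = np.exp(log_Z).mean()
print(estimate, a / x)  # should agree up to Monte Carlo / discretization error
```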