Discrete and continuous Girsanov


I'm trying to write a proof of the Girsanov theorem based on a discrete version of it.

Discrete version

Suppose that I have a random vector $X$ and two equivalent probability measures $\mathbb{P}, \mathbb{Q}$. Under $\mathbb{P}$, the components of $X$ are independent normals with means $\mu_i$ and variances $\sigma_i^2$. Under $\mathbb{Q}$, they are independent normals with mean $0$ and the same variances. The goal is to find the change of measure $\frac{d\mathbb{Q}}{d\mathbb{P}} : \Omega \to \mathbb{R}$ that makes this happen.

So in symbols, we have

$$ X_* \mathbb{P} \sim N(\mu, \mathrm{diag}(\sigma_1^2, \dots, \sigma_n^2))$$ $$ X_* \mathbb{Q} \sim N(0, \mathrm{diag}(\sigma_1^2, \dots, \sigma_n^2))$$

Writing $\lambda^n$ for the Lebesgue measure on $\mathbb{R}^n$ and writing out the densities we get:

$$ \frac{d X_* \mathbb{Q}}{dX_* \mathbb{P}} (\mathbf{x}) = \frac{d X_* \mathbb{Q}}{d\lambda^n} (\mathbf{x}) \, \frac{d \lambda^n}{dX_* \mathbb{P}} (\mathbf{x}) \\ = \frac{\prod_{i=1}^n \frac{1}{\sqrt{2 \pi} \sigma_i} \exp\left\{-\frac{x_i^2}{2\sigma_i^2} \right\}} {\prod_{i=1}^n \frac{1}{\sqrt{2 \pi} \sigma_i} \exp\left\{-\frac{(x_i-\mu_i)^2}{2\sigma_i^2} \right\}} \\ = \exp\left\{ \sum_{i=1}^n \frac{-2\mu_ix_i + \mu_i^2}{2\sigma_i^2} \right\} \\ = \exp\left\{ - \sum_{i=1}^n \frac{\mu_ix_i}{\sigma_i^2} + \frac{1}{2} \sum_{i=1}^n\frac{\mu_i^2}{\sigma_i^2} \right\} $$

Now let $B \in \mathcal{B}^n$ be a Borel set in $\mathbb{R}^n$.

$$ \mathbb{Q} (X^{-1} (B)) = X_* \mathbb{Q} (B) \\ = \int_B \frac{dX_*\mathbb{Q}}{dX_*\mathbb{P}} dX_*\mathbb{P} \\ = \int_{X^{-1}(B)} \left( \frac{dX_*\mathbb{Q}}{dX_*\mathbb{P}} \circ X \right) d\mathbb{P} $$

So it is sufficient to choose $\frac{d\mathbb{Q}}{d\mathbb{P}} = \frac{dX_*\mathbb{Q}}{dX_*\mathbb{P}} \circ X$.

Therefore

$$\frac{d\mathbb{Q}}{d\mathbb{P}} = \exp\left\{ - \sum_{i=1}^n \frac{\mu_iX_i}{\sigma_i^2} + \frac{1}{2} \sum_{i=1}^n\frac{\mu_i^2}{\sigma_i^2} \right\}$$

This result is what I would call the "discrete Girsanov formula". My question is whether it is possible to prove the continuous version as a limit of this one.
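The discrete formula can be sanity-checked by Monte Carlo: sample $X$ under $\mathbb{P}$, weight each sample by the candidate density $\frac{d\mathbb{Q}}{d\mathbb{P}}$, and confirm that the weights average to $1$ and that the weighted mean of $X$ is $0$. A minimal NumPy sketch with arbitrary illustrative values of $\mu$ and $\sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_samples = 3, 1_000_000
mu = np.array([0.3, -0.5, 0.2])     # illustrative means under P
sigma = np.array([1.0, 0.8, 1.5])   # illustrative standard deviations

# Sample X under P: independent N(mu_i, sigma_i^2) components.
X = mu + sigma * rng.standard_normal((n_samples, n))

# Radon-Nikodym derivative dQ/dP evaluated at each sample.
Z = np.exp(-np.sum(mu * X / sigma**2, axis=1)
           + 0.5 * np.sum(mu**2 / sigma**2))

print(Z.mean())                         # ≈ 1: Z integrates to 1 under P
print((Z[:, None] * X).mean(axis=0))    # ≈ 0: X has mean 0 under Q
```

The first printed value checks that $Z$ is a genuine density ($E_\mathbb{P}[Z] = 1$); the second checks $E_\mathbb{P}[Z X] = E_\mathbb{Q}[X] = 0$.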

Continuous version

$X(t) = W(t) + \int_0^t \Theta(u) du$ where $W(t)$ is a $\mathbb{P}$ Brownian motion and $\Theta(u)$ is an adapted process. Assuming that $X(t)$ is a $\mathbb{Q}$ Brownian motion,

$$ \frac{d\mathbb{Q}}{d\mathbb{P}} = \exp\left\{ -\int_0^t \Theta(u) dW(u) - \frac{1}{2} \int_0^t \Theta(u)^2 du \right\} \\ = \exp\left\{ -\int_0^t \Theta(u) dX(u) + \frac{1}{2} \int_0^t \Theta(u)^2 du \right\}$$

(This statement is paraphrased from Shreve's Stochastic Calculus for Finance; it omits the integrability hypothesis on $\Theta$, a Novikov-type condition.)

It looks an awful lot like the discrete version!
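The resemblance can be seen numerically by discretising time: reweighting by the (discretised) density $Z$ should remove the drift, so that under the weighted expectation $X(T)$ has mean $0$ and variance $T$, as a $\mathbb{Q}$ Brownian motion must. A minimal sketch with a hypothetical constant $\Theta$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, T = 200_000, 100, 1.0
dt = T / n_steps
theta = 0.8   # hypothetical constant value of Theta(u)

# P-Brownian increments and terminal values.
dW = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
W = dW.sum(axis=1)          # W(T)
X = W + theta * T           # X(T) = W(T) + int_0^T Theta du

# Density Z(T) = exp{ -int Theta dW - (1/2) int Theta^2 du }.
Z = np.exp(-theta * W - 0.5 * theta**2 * T)

print(Z.mean())             # ≈ 1
print((Z * X).mean())       # ≈ 0: mean of X(T) under Q
print((Z * X**2).mean())    # ≈ T: variance of X(T) under Q
```

With constant $\Theta$ the stochastic integral collapses to $\Theta\, W(T)$, which keeps the check simple while exercising the same formula.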

My question is: is it possible to pass from the discrete to the continuous version by a partitioning argument? It doesn't have to be super-rigorous (I don't even fully understand the construction of Brownian motion).

Accepted answer

For a discrete version of Girsanov's theorem with adapted drift you need to consider a sequence
$$ X_n = X_{n-1} + \mu_n + \sigma_n \epsilon_n, $$
where $\{\epsilon_n\}$ are iid standard Gaussian variables, and $\{\mu_n, \sigma_n\}$ are predictable (for simplicity, let $\mu_n$ and $\mu_n/\sigma_n$ also be bounded).

In order to construct the martingale density, consider the density process
$$ Z_n = \exp\left\{-\sum_{k=1}^n\frac{\mu_k}{\sigma_k} \epsilon_k -\sum_{k=1}^n\frac{\mu_k^2}{2\sigma_k^2}\right\}. $$
This is a martingale. Moreover, it is not hard to check that
$$ E[X_n Z_n\mid\mathcal F_{n-1}] = X_{n-1}Z_{n-1}, \quad n\ge 1. $$
Therefore,
$$ E^N[X_n\mid\mathcal F_{n-1}] = X_{n-1}, \quad n=1,\dots,N, $$
where the expectation is taken w.r.t. the measure $P^N$ with density $\frac{dP^N}{dP} = Z_N$.
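This construction can be checked by simulation: with predictable coefficients (below, hypothetical bounded functions of $X_{n-1}$), the density $Z_N$ should average to $1$, and the martingale property under $P^N$ gives $E[Z_N X_N] = X_0$. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, N, x0 = 500_000, 5, 1.0

X = np.full(n_paths, x0)
logZ = np.zeros(n_paths)
for n in range(N):
    # Predictable (F_{n-1}-measurable) coefficients: hypothetical
    # bounded choices depending on the past only through X_{n-1}.
    mu = 0.1 * np.tanh(X)
    sigma = 1.0 + 0.5 * np.tanh(X)**2
    eps = rng.standard_normal(n_paths)
    # Accumulate log Z_n = -sum (mu_k/sigma_k) eps_k - sum mu_k^2/(2 sigma_k^2).
    logZ -= (mu / sigma) * eps + mu**2 / (2 * sigma**2)
    X = X + mu + sigma * eps

Z = np.exp(logZ)
print(Z.mean())        # ≈ 1: Z_N is a density
print((Z * X).mean())  # ≈ x0: X is a martingale under P^N
```

Note the order of operations inside the loop: $\mu_n$ and $\sigma_n$ are computed before $\epsilon_n$ is drawn, which is exactly the predictability requirement.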

This will be your discrete-time analogue of the Girsanov theorem. Now, in order to pass to a continuous-time version, you should take $\mu_n = \frac{\mu^N(n/N)}{N}$, $\sigma_n = \frac{\sigma^N(n/N)}{\sqrt{N}}$ so that $\sum_{n=1}^N \mu^N(n/N) \mathbf{1}_{[(n-1)/N,n/N)}(t)$ and $\sum_{n=1}^N \sigma^N(n/N) \mathbf{1}_{[(n-1)/N,n/N)}(t)$ converge to $\mu(t)$ and $\sigma(t)$ respectively, as $N\to\infty$. Then the corresponding process $X^N(t) = \sum_{n=1}^N X^N_n \mathbf{1}_{[(n-1)/N,n/N)}(t)$ will converge to $X(t) = X(0) + \int_0^t \mu(s)\, ds + \int_0^t \sigma(s)\, dW(s)$, while the Girsanov density $Z_N$ will converge to
$$ \exp\left\{-\int_0^1 \frac{\mu(t)}{\sigma(t)}\, dW(t) -\int_0^1\frac{\mu(t)^2}{2\sigma(t)^2}\, dt\right\}. $$
With respect to this density, $X$ will be a martingale, and $W(t) + \int_0^t \Theta(s)\, ds$ with $\Theta = \mu/\sigma$ will be a standard Wiener process.
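The scaling limit can be illustrated numerically: with $\mu_n = \mu(n/N)/N$ and $\sigma_n = \sigma(n/N)/\sqrt{N}$ for hypothetical deterministic $\mu$ and $\sigma$, the reweighted $X^N(1)$ has mean $X(0) = 0$ even though its drift $\int_0^1 \mu(t)\,dt$ is nonzero under $P$. A sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, N = 200_000, 256
t = np.arange(1, N + 1) / N           # grid points n/N in (0, 1]
mu = np.sin(2 * np.pi * t) + 1.0      # hypothetical mu(t), int_0^1 mu = 1
sigma = 1.0 + 0.5 * t                 # hypothetical sigma(t) > 0

eps = rng.standard_normal((n_paths, N))
mu_n = mu / N                         # mu_n    = mu(n/N) / N
sigma_n = sigma / np.sqrt(N)          # sigma_n = sigma(n/N) / sqrt(N)

# Terminal value X^N(1), started at X(0) = 0, and the density Z_N.
X1 = np.sum(mu_n + sigma_n * eps, axis=1)
Z = np.exp(-np.sum((mu_n / sigma_n) * eps
                   + mu_n**2 / (2 * sigma_n**2), axis=1))

print(Z.mean())         # ≈ 1
print(X1.mean())        # ≈ 1 under P (the drift integral)
print((Z * X1).mean())  # ≈ 0: the drift vanishes under the new measure
```

Note that the exponent of $Z_N$ here is exactly a Riemann-sum discretisation of $-\int_0^1 \frac{\mu}{\sigma}\, dW - \frac{1}{2}\int_0^1 \frac{\mu^2}{\sigma^2}\, dt$, since $\mu_n/\sigma_n = \Theta(n/N)/\sqrt{N}$ and $\epsilon_n/\sqrt{N}$ plays the role of $dW$.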