Mean and variance of this random variable


How can we compute the mean and variance of $e^{W_tW_s} $ where $(W_t)_{t \geq 0} $ is a Brownian motion?

If we want to compute $ \mathbb{E}(W_tW_s) $, the usual approach is to assume that $ s \leq t$, write $W_tW_s = (W_t - W_s)W_s + W_s^2$, and use independence of increments to finish the computation. Due to the exponential function, that trick does not work directly here: we would need to write $W_tW_s$ as a sum of functions of independent increments, so that $e^{W_tW_s}$ factors into a product of independent terms.
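As a quick numerical sanity check of the increment trick (a sketch; the values of $s$, $t$ and the sample size are illustrative choices, not from the question):

```python
import numpy as np

# Check numerically that E[W_t W_s] = min(s, t) = s for s <= t,
# simulating (W_s, W_t) via the independent-increment decomposition.
rng = np.random.default_rng(0)
s, t, n = 0.3, 0.7, 1_000_000

W_s = np.sqrt(s) * rng.standard_normal(n)            # W_s ~ N(0, s)
W_t = W_s + np.sqrt(t - s) * rng.standard_normal(n)  # add independent N(0, t-s) increment

print(np.mean(W_t * W_s))  # should be close to s = 0.3
```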

Does anyone have any suggestions?



BEST ANSWER

Recall that $(W_s, W_t)$ is jointly Gaussian with zero mean and, assuming without loss of generality $s < t$, covariance matrix $$ \begin{pmatrix}\operatorname{Cov}(W_s,W_s) & \operatorname{Cov}(W_s,W_t) \\ \operatorname{Cov}(W_t,W_s) & \operatorname{Cov}(W_t,W_t) \end{pmatrix} = \begin{pmatrix}s & s \\ s & t \end{pmatrix}. $$ We are thus faced with finding the expectation of a function of two random variables. To simplify, recall that the Gaussian random variable $V = W_t-W_s$ is independent of $W_s$. Thus, using $(W_s, W_t) \stackrel{\text{law}}{=} (W_s, W_s+V)$, we have $$ \mathbb{E}\left(\exp\left(W_s W_t\right)\right) = \mathbb{E}\left(\exp\left(W_s \left(W_s + V\right)\right)\right) = \mathbb{E}\left(\exp\left(W_s^2 \right) \exp\left(W_s V \right) \right). $$ Now, by the law of total expectation: $$\begin{eqnarray} \mathbb{E}\left(\exp\left(W_s^2 \right) \exp\left(W_s V \right) \right) &=& \mathbb{E}\left( \mathbb{E}\left(\exp\left(W_s^2 \right) \exp\left(W_s V \right) \mid W_s \right) \right) \\ &=& \mathbb{E}\left( \exp\left(W_s^2 \right) \mathbb{E}\left(\exp\left(W_s V \right) \mid W_s \right) \right) \tag{1} \end{eqnarray} $$ As $V$ is Gaussian with mean $0$ and variance $t-s$, we have, for any real $w$, $$\mathbb{E}\left( \exp(w V) \right) = \exp\left( \mathbb{E}(V)w + \frac{w^2}{2} \operatorname{Var}(V) \right) = \exp\left( \frac{w^2}{2} \left(t-s\right) \right) $$ Therefore, eq. $(1)$ now reads: $$ \mathbb{E}\left( \exp\left(W_s^2 \right) \mathbb{E}\left(\exp\left(W_s V \right) \mid W_s \right) \right) = \mathbb{E}\left( \exp\left( W_s^2 \left( 1 + \frac{t-s}{2} \right) \right)\right) $$ Recall that $W_s \stackrel{\text{law}}{=} \sqrt{s} Z$ for a standard normal random variable $Z$, thus $W_s^2 \stackrel{\text{law}}{=} s Z^2 \stackrel{\text{law}}{=} s X$, where $X$ follows the $\chi^2$-distribution with one degree of freedom.
Thus $$\begin{eqnarray} \mathbb{E}\left( \exp\left( W_s^2 \left( 1 + \frac{t-s}{2} \right) \right)\right) &=& \mathbb{E}\left( \exp\left( s X \left( 1 + \frac{t-s}{2} \right) \right)\right) = \mathcal{M}_X\left( s \left( 1 + \frac{t-s}{2} \right) \right) \\ &=& \left. \frac{1}{\sqrt{1-2u}} \right|_{u = s \left( 1 + \frac{t-s}{2} \right)} = \frac{1}{\sqrt{(s-1)^2 - s t}} \end{eqnarray} $$ Of course, the moment generating function is only finite if $(s-1)^2 > s t$, i.e. if $2s\left(1 + \frac{t-s}{2}\right) < 1$. For the variance, apply the same computation to $\mathbb{E}\left(\exp\left(2 W_s W_t\right)\right)$: the exponent doubles, yielding $\mathcal{M}_X\left(2s\left(1 + t-s\right)\right)$, which is finite under the stricter condition $4s(1+t-s) < 1$.
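The closed form can be sanity-checked by Monte Carlo. Below is a sketch; the values of $s$, $t$ and the sample size are illustrative, chosen so that the second moment $\mathbb{E}(e^{2W_sW_t})$ also exists and the estimator has finite variance:

```python
import numpy as np

# Monte Carlo check of E[exp(W_s W_t)] = 1/sqrt((1-s)^2 - s*t).
# With s=0.1, t=0.2 we have 4s(1+t-s) = 0.44 < 1, so the estimator
# has finite variance and the sample mean converges nicely.
rng = np.random.default_rng(0)
s, t, n = 0.1, 0.2, 1_000_000

# Simulate (W_s, W_t) via independent increments: W_t = W_s + (W_t - W_s).
W_s = np.sqrt(s) * rng.standard_normal(n)
W_t = W_s + np.sqrt(t - s) * rng.standard_normal(n)

mc_mean = np.exp(W_s * W_t).mean()
closed_form = 1.0 / np.sqrt((1.0 - s) ** 2 - s * t)

print(mc_mean, closed_form)  # the two should agree to a few decimal places
```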

ANSWER

$\mathbb E\, e^{(W_t - W_s)W_s + W_s^2} = \mathbb E\big(\mathbb E (e^{(W_t - W_s)W_s + W_s^2}\mid \mathcal F_s) \big)= \mathbb E\, e^{\frac{t- s}{2}W_s^2 + W_s^2}$, since conditionally on $\mathcal F_s$ the increment $W_t - W_s$ is $N(0, t-s)$, so $\mathbb E(e^{w(W_t-W_s)}) = e^{\frac{w^2}{2}(t-s)}$ evaluated at $w = W_s$. The remaining expectation of an exponential of $W_s^2$ (the square of a Gaussian) is of a standard form.
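To make the last step explicit, here is a sketch of the standard Gaussian-square computation, with $c = 1 + \frac{t-s}{2}$ in the reduction above:

$$
\mathbb{E}\, e^{c W_s^2}
= \int_{-\infty}^{\infty} e^{c x^2}\, \frac{e^{-x^2/(2s)}}{\sqrt{2\pi s}}\, dx
= \frac{1}{\sqrt{1 - 2cs}}, \qquad 2cs < 1,
$$

and with $c = 1 + \frac{t-s}{2}$ this gives $1 - 2cs = (1-s)^2 - st$, i.e. $\mathbb{E}\, e^{W_sW_t} = \frac{1}{\sqrt{(1-s)^2 - st}}$, matching the other answer.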