If $X_t=\int_0^t \sigma _s\,\mathrm d W_s$, how can I prove that $\mathrm d \left<X\right>_t=\sigma _t^2\,\mathrm d t$

Let $\sigma $ be a very nice function and set $X_t:=\int_0^t \sigma _s\,\mathrm d W_s$. We define $$\left<X\right>_t=\lim_{n\to \infty }\sum_{i=0}^{m_n-1}\big(X_{t_{i+1}^{(n)}}-X_{t_i^{(n)}}\big)^2,$$ where the limit is taken in probability and $0=t_0^{(n)}<t_{1}^{(n)}<...<t_{m_n}^{(n)}=t$ is such that $\max_{i=0,...,m_n-1}|t_{i+1}^{(n)}-t_i^{(n)}|\to 0$ as $n\to \infty $.

Attempt (Steps 1 and 2 can be skipped; only Step 3 is relevant):

  • Step 1: Suppose $\sigma $ is of the form $\sigma _s=Y\boldsymbol 1_{[u,v)}(s)$, where $Y$ is $\mathcal F_u$-measurable, and take $(t_i^{(n)})$ to be a partition of $[u,v]$ such that $\max_{i}|t_{i+1}^{(n)}-t_i^{(n)}|\to 0$ as $n\to \infty $.

Then \begin{align*} &\mathbb E\left[\left(\sum_{i=0}^{m_n-1}(X_{t_{i+1}^{(n)}}-X_{t_i^{(n)}})^2-\int_0^t\sigma _s^2\,\mathrm d s\right)^2\right]\\ &=\sum_{i=0}^{m_n-1}\sum_{j=0}^{m_n-1}\mathbb E\left[Y^4\Big((W_{t_{i+1}^{(n)}}-W_{t_i^{(n)}})^2-(t_{i+1}^{(n)}-t_i^{(n)})\Big)\Big((W_{t_{j+1}^{(n)}}-W_{t_j^{(n)}})^2-(t_{j+1}^{(n)}-t_j^{(n)})\Big)\right]\\ &=\sum_{i=0}^{m_n-1}\mathbb E\left[Y^4\Big((W_{t_{i+1}^{(n)}}-W_{t_i^{(n)}})^2-(t_{i+1}^{(n)}-t_i^{(n)})\Big)^2\right]\\ &=2\mathbb E[Y^4]\sum_{i=0}^{m_n-1}(t_{i+1}^{(n)}-t_i^{(n)})^2\\ &\leq 2\mathbb E[Y^4](v-u)\max_{i=0,...,m_n-1}|t_{i+1}^{(n)}-t_i^{(n)}|\underset{n\to \infty }{\longrightarrow }0. \end{align*}
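The factor $2$ in the third equality is the standard Gaussian fourth-moment computation: if $Z:=W_{t_{i+1}^{(n)}}-W_{t_i^{(n)}}\sim\mathcal N(0,h)$ with $h:=t_{i+1}^{(n)}-t_i^{(n)}$, then $$\mathbb E\big[(Z^2-h)^2\big]=\mathbb E[Z^4]-2h\,\mathbb E[Z^2]+h^2=3h^2-2h^2+h^2=2h^2,$$ using $\mathbb E[Z^4]=3h^2$ for a centered Gaussian with variance $h$. Independence of $Z$ from $\mathcal F_u$, hence from $Y$, is what lets the expectation factor as $\mathbb E[Y^4]\,\mathbb E[(Z^2-h)^2]$.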

  • Step 2: Now, if $\sigma $ is a simple predictable process, i.e. a finite sum of terms as in Step 1, the argument of Step 1 carries over.

  • Step 3: If $\sigma $ is more general (i.e. such that $\mathbb E\int_0^t \sigma _s^2\,\mathrm d s<\infty $), there is a sequence $(\sigma ^n)$ of simple predictable processes such that $\sigma ^n\to \sigma $ in $L^2(\mathrm d t\otimes \mathrm d \mathbb P)$. In particular, $$\int_0^t \sigma ^n_s\,\mathrm d W_s\to \int_0^t \sigma _s\,\mathrm d W_s,\quad \text{in }L^2.$$

Let $(t_i^{(m)})$ be a partition of $[0,t]$, $0=t_0^{(m)}<...<t_{k_m}^{(m)}=t$, such that $\max_{i=0,...,k_m-1}|t_{i+1}^{(m)}-t_i^{(m)}|\to 0$ as $m\to \infty $.

I know by Step 2 that $$\lim_{m\to \infty }\mathbb E\left[\left(\sum_{i=0}^{k_m-1}\left(\int_{t_{i}^{(m)}}^{t_{i+1}^{(m)}}\sigma _s^{n}\,\mathrm d W_s\right)^2-\int_0^t(\sigma _s^n)^2\,\mathrm d s\right)^2\right]=0.$$

From that, how can I prove that

$$\lim_{m\to \infty }\mathbb E\left[\left(\sum_{i=0}^{k_m-1}\left(\int_{t_{i}^{(m)}}^{t_{i+1}^{(m)}}\sigma _s\,\mathrm d W_s\right)^2-\int_0^t(\sigma _s)^2\,\mathrm d s\right)^2\right]=0 \ \ ?$$

If this approach doesn't work, is there perhaps another way?
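As a numerical sanity check of the claimed limit (not a proof), one can simulate a single path and compare the quadratic-variation sum with $\int_0^t \sigma _s^2\,\mathrm d s$. The Euler discretization and the illustrative choice $\sigma _s=W_s$ below are my own, assuming NumPy is available:

```python
import numpy as np

# Numerical sanity check of  d<X>_t = sigma_t^2 dt  on one simulated path.
# Illustrative choice (not from the post): sigma_s = W_s, t = 1.
rng = np.random.default_rng(0)

n = 200_000                      # number of partition intervals
t = 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)   # Brownian increments
W = np.concatenate(([0.0], np.cumsum(dW)))  # Brownian path on the grid

sigma = W[:-1]                   # sigma evaluated at left endpoints t_i

# Euler increments of X_t = int_0^t sigma_s dW_s
dX = sigma * dW

# Quadratic-variation sum  sum_i (X_{t_{i+1}} - X_{t_i})^2
qv = np.sum(dX**2)

# The claimed limit  int_0^t sigma_s^2 ds  (left Riemann sum)
integral = np.sum(sigma**2) * dt

rel_err = abs(qv - integral) / integral
print(f"QV sum = {qv:.5f}, integral = {integral:.5f}, rel. error = {rel_err:.2%}")
```

On a fine grid like this, the two quantities typically agree to within a fraction of a percent on a single path, consistent with the $L^2$ rate seen in Step 1.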

On BEST ANSWER

Let's assume $E\int_0^T \sigma_s^2\,ds < \infty$. Then the usual way to show that $\langle X \rangle_t = \int_0^t \sigma_s^2\,ds =: A_t$ is to show that $X_t := \int_0^t \sigma_s dW_s$ is a martingale and that $X^2 - A$ is a martingale. Since $A$ is continuous, adapted, increasing, and $A_0 = 0$, this property characterizes the quadratic variation and will prove your claim.

Take simple $\sigma^{(n)} \to \sigma$ converging in the sense that $E\int_0^T |\sigma_s-\sigma^{(n)}_s|^2\,ds \to 0$. For simple $\sigma^{(n)}$ the martingale and quadratic variation properties are easy to check. Let $I^{(n)}_t := \int_0^t \sigma^{(n)}_s\,dW_s$ and $A^{(n)}_t := \int_0^t|\sigma^{(n)}_s|^2 \,ds$. First check the martingale property. Fix $s \leq t \leq T$.

$$ E[I^{(n)}_t|\mathcal{F}_s] = I^{(n)}_s. $$ By the Itô isometry for simple processes, the RHS tends to $X_s$ in $L^2$. Since the conditioned variable on the LHS converges to $X_t$ in $L^2$ and the conditional expectation is a contraction on $L^p$ for any $p \geq 1$ (use that $E[|E[Y | \mathcal{F}]|^p] \leq E[|Y|^p]$ by conditional Jensen and the tower property, for any $Y \in L^p$ and any $\mathcal{F}$), we find that $E[I^{(n)}_t|\mathcal{F}_s] \to E[X_t | \mathcal{F}_s]$ in $L^2$ as well, showing $E[X_t |\mathcal{F}_s] = X_s$. Thus $X$ is a martingale.
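Spelled out, the contraction step (with $p=2$) reads $$ E\Big[\big(E[I^{(n)}_t|\mathcal{F}_s]-E[X_t|\mathcal{F}_s]\big)^2\Big] = E\Big[E[I^{(n)}_t-X_t|\mathcal{F}_s]^2\Big] \leq E\big[(I^{(n)}_t-X_t)^2\big] \underset{n\to\infty}{\longrightarrow} 0, $$ so passing to the $L^2$ limit on both sides of the display above gives $E[X_t|\mathcal{F}_s]=X_s$.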

Similarly, we can check the quadratic variation. By the properties for simple integrands, we have $$ E[(I^{(n)}_t)^2 - A^{(n)}_t|\mathcal{F}_s] = (I_s^{(n)})^2 - A^{(n)}_s. $$ Now $(I_s^{(n)})^2 - A^{(n)}_s \to X^2_s-A_s$ in $L^1$ (the second term by the definition of $\sigma^{(n)} \to \sigma$, the first by the $L^2$ convergence $I^{(n)}_s \to X_s$ coming from the isometry for simple processes), and likewise $(I_t^{(n)})^2 - A^{(n)}_t \to X^2_t-A_t$ in $L^1$. Together with the fact that the conditional expectation is a contraction on $L^1$, this shows that $X^2-A$ is a martingale. By uniqueness of the quadratic variation, we find that $A = \langle X \rangle$.
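One way to see the $L^1$ convergence of the squared term is via Cauchy–Schwarz: $$ E\big|(I^{(n)}_s)^2 - X_s^2\big| = E\big[|I^{(n)}_s - X_s|\,|I^{(n)}_s + X_s|\big] \leq \big(E[(I^{(n)}_s - X_s)^2]\big)^{1/2}\big(E[(I^{(n)}_s + X_s)^2]\big)^{1/2} \longrightarrow 0, $$ since the first factor tends to $0$ by the Itô isometry while the second factor stays bounded.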