I'm trying to calculate the following integral without using Itô's Lemma: $$\int_0^t W_s\,dW_s$$ I am confused about how $dW^2=dt$ arises when Itô's Lemma is not used.
Hint
Let $W$ be a standard Wiener process and $t$ an arbitrary positive real number. For each $n$, set $t_i=it2^{-n}$ for $i=0,1,\dots,2^n$, and write $\Delta W(t_i)=W(t_{i+1})-W(t_i)$. Then:
- show that $\sum_i(\Delta W(t_i))^2$ converges to $t$ as $n$ grows;
- show that the terms in the sum are i.i.d. and that their variance shrinks sufficiently fast as $n$ grows, using the fourth moment of a Gaussian distribution.
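The first bullet can be checked by simulation. Here is a minimal sketch (NumPy assumed; grid sizes and the seed are arbitrary choices for illustration):

```python
import numpy as np

# Sum of squared Brownian increments on a dyadic grid should approach t.
rng = np.random.default_rng(0)
t = 1.0
for n in (4, 8, 12):
    dt = t * 2.0**-n                         # spacing of the grid t_i = i * t * 2^-n
    dW = rng.normal(0.0, np.sqrt(dt), 2**n)  # increments Delta W(t_i)
    qv = np.sum(dW**2)                       # sum_i (Delta W(t_i))^2
    print(n, qv)
```

As $n$ grows the printed sum settles near $t=1$, in line with the hint.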
So, I am left with the following:
$$E(\sum_i(\Delta W(t_i))^2)=E(\sum_i(W(t_{i+1})-W(t_{i}))^2)=?$$
What I have done
So, can I say that since $W(t_{i+1})-W(t_{i})\sim\sqrt{t_{i+1}-t_{i}}\,N(0,1)$, by the strong law of large numbers $\sum_i(W(t_{i+1})-W(t_{i}))^2$ converges to $\sum_i E\big[(W(t_{i+1})-W(t_{i}))^2\big]=\sum_i(t_{i+1}-t_{i})=t$?
Is this correct?
Recall that the Riemann sum approximation is:
$$X_t^m = \sum_{t_j<t}f_{t_j}\Delta W_j$$
where $\Delta t = 2^{-m}$ and $W_t$ is a standard Brownian motion, and
$$\Delta W_j = W_{t_{j+1}}-W_{t_j}$$
The convergence is pathwise for every Brownian motion, and the Riemann approximation converges to a limit that is measurable with respect to the filtration up to $t$, because $X_t$ is a continuous function of $W_{[0,t]}$. Here I will skip a lot of the explanation of the Riemann sum approximation (and will use the fact that $\Delta W_j^2$ has mean $\Delta t$ and variance $2\Delta t^2$, which is essentially the hint above; see this) and go straight to the problem at hand:
$$X_t=\int_{0}^t W_s dW_s$$
By Riemann sum approx. we write
$$X_t^m=\sum_{t_j<t}W_{t_j}(W_{t_{j+1}}-W_{t_{j}})$$
Noticing that:
$$W_{t_j}=\frac{1}{2}(W_{t_{j+1}}+W_{t_j})-\frac{1}{2}(W_{t_{j+1}}-W_{t_j})$$
So we can write:
$$X_t^m=\frac{1}{2}\underbrace{\sum_{t_j<t}(W_{t_{j+1}}+W_{t_j})(W_{t_{j+1}}-W_{t_j})}_{\text{first sum}}-\frac{1}{2}\underbrace{\sum_{t_j<t}(W_{t_{j+1}}-W_{t_j})(W_{t_{j+1}}-W_{t_j})}_{\text{second sum}}$$
Notice, $$(W_{t_{j+1}}+W_{t_j})(W_{t_{j+1}}-W_{t_j})=W^2_{t_{j+1}}-W_{t_j}^2$$
By telescoping, and letting $t_n=\max\{t_j\mid t_j<t\}$, the first sum equals $W^2_{t_{n+1}}-W_0^2$, so the first term becomes $\frac{1}{2}(W^2_{t_{n+1}}-W_0^2)$. Since $W_0=0$, this is $\frac{1}{2}W^2_{t_{n+1}}$, and $W_{t_{n+1}}\rightarrow W_t$ as $\Delta t\rightarrow 0$. The second sum is $S=\sum_{t_j<t}\Delta W_j^2$, and since $E(\Delta W_j^2)=\Delta t$,
$$E(S)=\sum_{t_j<t}\Delta t=t_n$$
This answers the ? in the question above, and $t_n\rightarrow t$ as $\Delta t\rightarrow 0$. Similarly, for the variance,
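The algebra so far can be sanity-checked numerically. A sketch (NumPy assumed; $m$ and the seed are arbitrary): the telescoping identity $\sum_{t_j<t}W_{t_j}\Delta W_j=\frac{1}{2}W^2_{t_{n+1}}-\frac{1}{2}S$ holds exactly on every sampled path, while $S$ is only close to $t$:

```python
import numpy as np

# Check: sum_j W_{t_j} dW_j equals (1/2) W_{t_{n+1}}^2 - (1/2) S exactly,
# and S = sum_j dW_j^2 is close to t.
rng = np.random.default_rng(1)
t, m = 1.0, 14
dt = 2.0**-m
dW = rng.normal(0.0, np.sqrt(dt), int(t / dt))  # increments on [0, t)
W = np.concatenate(([0.0], np.cumsum(dW)))      # W[j] = W_{t_j}, W[0] = 0
X_m = np.sum(W[:-1] * dW)                       # left-point Riemann sum
S = np.sum(dW**2)                               # second sum
print(X_m - (0.5 * W[-1] ** 2 - 0.5 * S))       # zero up to rounding
print(S)                                        # close to t = 1
```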
$$\mathrm{var}(S)=2\Delta t\sum_{t_j<t}\Delta t=2\Delta t\,t_n\leq 2t\,2^{-m}$$
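The variance formula can also be checked by Monte Carlo. A rough sketch (NumPy assumed; path count and seed are arbitrary), estimating $\mathrm{var}(S)$ over many independent paths and comparing with $2\Delta t\,t$ for an even grid on $[0,t]$:

```python
import numpy as np

# Sample variance of S over many paths vs. the theoretical 2 * dt * t
# (for an even grid on [0, t], var(S) = 2 * dt^2 * (t / dt) = 2 * dt * t).
rng = np.random.default_rng(2)
t, m, n_paths = 1.0, 10, 5000
dt = 2.0**-m
dW = rng.normal(0.0, np.sqrt(dt), (n_paths, int(t / dt)))
S = np.sum(dW**2, axis=1)         # one quadratic-variation sum per path
print(S.var(), 2.0 * dt * t)      # the two numbers agree closely
```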
Thus, $S\rightarrow t$ in mean square as $m\rightarrow\infty$, and in the limit
$$X_t=\int_0^t W_s\,dW_s=\frac{1}{2}W_t^2-\frac{1}{2}t$$
which is the integral from the problem statement. Notice that if $W_t$ were differentiable we would obtain $\frac{1}{2}W_t^2$ instead.
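That closing remark can be illustrated numerically: replacing the left endpoint $W_{t_j}$ by the average $\frac{1}{2}(W_{t_{j+1}}+W_{t_j})$ (the Stratonovich convention) makes the sum telescope to exactly $\frac{1}{2}W_t^2$, the ordinary-calculus answer. A minimal sketch (NumPy assumed; $m$ and the seed are arbitrary):

```python
import numpy as np

# Left-point (Ito) sum vs. trapezoid-average (Stratonovich) sum on one path.
rng = np.random.default_rng(3)
t, m = 1.0, 12
dt = 2.0**-m
dW = rng.normal(0.0, np.sqrt(dt), int(t / dt))
W = np.concatenate(([0.0], np.cumsum(dW)))    # W[j] = W_{t_j}, W[0] = 0
ito = np.sum(W[:-1] * dW)                     # approx (1/2) W_t^2 - (1/2) t
trap = np.sum(0.5 * (W[:-1] + W[1:]) * dW)    # telescopes to (1/2) W_t^2 exactly
print(ito, trap, 0.5 * W[-1] ** 2)
```

On every path the two sums differ by $\frac{1}{2}S\approx\frac{1}{2}t$, which is precisely the Itô correction.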