I'm trying to prove that $$\left<X\right>_t=\int_0^t G_s^2ds$$ when $$X_t=\underbrace{\int_0^t F_sds}_{=:A_t}+\underbrace{\int_0^t G_sdW_s}_{=:M_t},$$ where $$\left<X\right>_t:=\lim_{n\to \infty }\sum_{i=1}^n(X_i-X_{i-1})^2,$$ in some sense ($L^2$ or a.s., depending on the sense in which the limit exists).
I denote $X_i:=X_{\frac{it}{n}}$, i.e. I work with the uniform partition $t_i=\frac{it}{n}$ of $[0,t]$.
\begin{align*} \sum_{i=1}^n(X_{i}-X_{i-1})^2&=\sum_{i=1}^n(A_{i}-A_{i-1})^2+2\sum_{i=1}^n (A_i-A_{i-1})(M_{i}-M_{i-1})+\sum_{i=1}^n(M_{i}-M_{i-1})^2. \end{align*} Now $$\sum_{i=1}^n(A_{i}-A_{i-1})^2\underset{n\to \infty }{\longrightarrow }0\quad a.s.$$
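For the first convergence, here is a standard justification sketch, assuming (as needed for $A$ to be well defined) that $\int_0^t |F_s|\,ds<\infty$ a.s.: since $A$ is continuous with finite variation,

$$\sum_{i=1}^n(A_i-A_{i-1})^2\leq \max_{1\leq i\leq n}|A_i-A_{i-1}|\sum_{i=1}^n|A_i-A_{i-1}|\leq \max_{1\leq i\leq n}|A_i-A_{i-1}|\int_0^t|F_s|\,ds\underset{n\to \infty }{\longrightarrow }0\quad a.s.,$$

the maximum tending to $0$ by uniform continuity of $s\mapsto A_s$ on $[0,t]$.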
$$\sum_{i=1}^n(M_{i}-M_{i-1})^2\underset{n\to \infty }{\overset{L^2}{\longrightarrow} }\int_0^t G_s^2ds.$$
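As a numerical sanity check (not part of the proof), one can simulate the martingale term and compare its sum of squared increments with $\int_0^t G_s^2\,ds$. The choices $t=1$, the deterministic integrand $G_s=s$ (so the limit should be $\int_0^1 s^2\,ds=\tfrac13$), the grid size, and the seed are mine:

```python
import numpy as np

# Sum of squared increments of M_t = int_0^t s dW_s on [0, 1],
# approximated with an Euler scheme on a fine uniform grid.
rng = np.random.default_rng(0)
n = 200_000                                  # number of partition intervals
dt = 1.0 / n
s = np.arange(n) * dt                        # left endpoints s_{i-1}
dW = rng.normal(0.0, np.sqrt(dt), size=n)    # Brownian increments
dM = s * dW                                  # increments of M (Euler scheme)
qv = np.sum(dM ** 2)                         # quadratic variation sum
print(qv)                                    # should be close to 1/3
```

This only illustrates the $L^2$ convergence for one integrand; it proves nothing about the general case.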
Finally, by the Cauchy–Schwarz inequality, $$\sum_{i=1}^n(A_i-A_{i-1})(M_i-M_{i-1})\leq\sqrt{\sum_{i=1}^n(A_i-A_{i-1})^2}\sqrt{\sum_{i=1}^n(M_i-M_{i-1})^2}.$$
Since one sum converges to $0$ a.s. and the other converges to $\int_0^t G_s^2ds$ in $L^2$, I can't say anything a priori about the cross term: a priori, $\sum_{i=1}^n(A_i-A_{i-1})^2$ need not converge to $0$ in $L^2$, and $\sum_{i=1}^n(M_i-M_{i-1})^2$ need not converge a.s. So in which sense does $\sum_{i=1}^n(X_i -X_{i-1})^2$ converge? The best I can say is that there is a subsequence along which $$\sum_{i=1}^{n_k}(X_{i}-X_{i-1})^2\underset{k\to \infty }{\longrightarrow }\int_0^t G_s^2ds\quad \text{a.s.}$$
So, how can I conclude? Maybe $\sum_{i=1}^n(A_i-A_{i-1})^2\to 0$ in $L^2$? I tried to prove that, but I couldn't conclude (unless $(A_i)$ is uniformly bounded by an $L^2$ random variable).
Note that, since $\sum_{i=1}^n|A_{t_i}-A_{t_{i-1}}|\leq \int_0^T |F(s)|\,ds$, $$\sum_{i=1}^n|A_{t_i}-A_{t_{i-1}}||M_{t_i}-M_{t_{i-1}}|\leq \sup_{|s-t| \leq |\Pi|, s,t \in [0,T]} |M_s-M_t| \int_0^T |F(s)| \,ds.$$
Since the paths of the Itô integral $M$ are continuous, hence uniformly continuous on $[0,T]$, the supremum converges to $0$ a.s. as the mesh $|\Pi|\to 0$, so the cross term vanishes a.s.
Edit:
The convergence of $\sum_{i}(X_i-X_{i-1})^2$ has to be understood as convergence in probability. See Remark IV.1.19 in Revuz & Yor.