Suppose that $b, \sigma \in L^{\infty}(F)$ and $X_t:=\int_{0}^{t}b_s\,ds+\int_{0}^{t}\sigma_s\,dB_s$. Let $\pi:0=t_0<t_1<\dots<t_n=T$ be a partition of $[0,T]$ and write $B_{t_i,t_{i+1}}:=B_{t_{i+1}}-B_{t_i}$.
Denote $S_M(\pi):=\sum_{i=0}^{n-1}(\frac{X_{t_i}+X_{t_{i+1}}}{2})B_{t_i,t_{i+1}}$.
Show that $S_M(\pi)\rightarrow\int_{0}^{T}X_t\,dB_t+\frac{1}{2}\int_{0}^{T}\sigma_t\,dt$ in $L^2$ as $|\pi|\rightarrow 0$.
My idea: we know that $\sum_{i=0}^{n-1}X_{t_i}B_{t_i,t_{i+1}}\rightarrow \int_{0}^{T}X_t\,dB_t$ in $L^2$. So it suffices to show that
$$\sum_{i=0}^{n-1}\frac{X_{t_{i+1}}-X_{t_i}}{2}\,B_{t_i,t_{i+1}}$$
converges to $\frac{1}{2}\int_{0}^{T}\sigma_t\,dt$ in $L^2$.
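Plugging in the definition of $X$ from above, the increment over one subinterval splits into a drift part and a martingale part, so the sum can be rewritten as
$$X_{t_{i+1}}-X_{t_i}=\int_{t_i}^{t_{i+1}}b_s\,ds+\int_{t_i}^{t_{i+1}}\sigma_s\,dB_s,\qquad \sum_{i=0}^{n-1}\frac{X_{t_{i+1}}-X_{t_i}}{2}\,B_{t_i,t_{i+1}}=\frac{1}{2}\sum_{i=0}^{n-1}\Big(\int_{t_i}^{t_{i+1}}b_s\,ds\Big)B_{t_i,t_{i+1}}+\frac{1}{2}\sum_{i=0}^{n-1}\Big(\int_{t_i}^{t_{i+1}}\sigma_s\,dB_s\Big)B_{t_i,t_{i+1}}.$$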
To do so, we need to simplify the expression
$$\mathbb{E}\left[\left|\sum_{i=0}^{n-1}\frac{X_{t_{i+1}}-X_{t_i}}{2}\,B_{t_i,t_{i+1}}-\frac{1}{2}\int_{0}^{T}\sigma_t\,dt\right|^2\right],$$
which is where I got stuck.
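As a numerical sanity check of the claimed limit (not part of the proof), here is a small simulation with the hypothetical choice $b=0$, $\sigma=1$, so $X=B$ and the correction term is $\frac{1}{2}\int_0^T 1\,dt=T/2$. In this special case the midpoint sum even telescopes exactly to $B_T^2/2$:

```python
import numpy as np

# Sanity-check simulation (assumed parameters: b = 0, sigma = 1, so X = B).
# The midpoint sum S_M(pi) should exceed the left-point (Ito) sum by roughly
# (1/2) * int_0^T sigma_t dt = T/2.
rng = np.random.default_rng(0)
T, n = 1.0, 100_000
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), n)          # Brownian increments B_{t_i, t_{i+1}}
B = np.concatenate(([0.0], np.cumsum(dB)))    # path B_{t_0}, ..., B_{t_n}

ito_sum = np.sum(B[:-1] * dB)                 # sum of X_{t_i} * B_{t_i, t_{i+1}}
mid_sum = np.sum(0.5 * (B[:-1] + B[1:]) * dB) # midpoint sum S_M(pi)

print(mid_sum - ito_sum)                      # close to T/2 = 0.5
print(mid_sum - 0.5 * B[-1] ** 2)             # close to 0: the sum telescopes when X = B
```

The gap `mid_sum - ito_sum` equals $\frac{1}{2}\sum_i (B_{t_i,t_{i+1}})^2$, whose mean is exactly $T/2$ and whose variance vanishes as $|\pi|\to 0$, which is the $L^2$ convergence being asked about.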