I have the following problem: I want to prove that the vector $(W(1_{[t_0,t_1]}),\ldots,W(1_{[t_{n-1},t_n]}))$ is normally distributed with mean $0$ and covariance matrix $\Sigma=\operatorname{diag}(t_1-t_0,\ldots,t_n-t_{n-1})$, where $W(f):=\sum_{i\geq1}\langle f,e_i\rangle Y_i$, $(e_i)_{i\geq1}$ is an ONB of $L^2[0,1]$, and $(Y_i)_{i\geq1}$ is an i.i.d. sequence of standard Gaussian random variables (centered, with variance $1$).
I tried using characteristic functions, but I reach a point where I cannot conclude.
Let $X^m:=\left(\sum_{i=1}^m\langle 1_{[t_0,t_1]},e_i\rangle Y_i,\ldots,\sum_{i=1}^m\langle 1_{[t_{n-1},t_n]},e_i\rangle Y_i\right)$, so that
\begin{align*} \Phi_{X^m}(s) &=\mathbb{E}[\exp(i\langle s,X^m\rangle)] \\ &=\mathbb{E}\left[\exp\left(i\sum_{k=1}^m Y_k\sum_{j=1}^n\langle 1_{[t_{j-1},t_j]},e_k\rangle s_j\right)\right]\\ &=\prod_{k=1}^m\mathbb{E}\left[\exp\left(iY_k\sum_{j=1}^n\langle 1_{[t_{j-1},t_j]},e_k\rangle s_j\right)\right] \\&=\exp \left(-\frac{1}{2} \left[\sum_{j=1}^n\langle 1_{[t_{j-1},t_j]},e_1\rangle s_j \right]^2 \right) \cdots \exp \left( -\frac{1}{2} \left[ \sum_{j=1}^n\langle 1_{[t_{j-1},t_j]},e_m\rangle s_j \right]^2 \right), \end{align*}
where the last equality uses the characteristic function $\mathbb{E}[e^{iaY_k}]=e^{-a^2/2}$ of a standard Gaussian.
But now I have no idea how to conclude, and I am also starting to think that this may not be the right approach.
Set
$$f_n(u) := \sum_{j=1}^n s_j 1_{[t_{j-1},t_j]}(u).$$
Then, since $\langle f_n, e_k \rangle = \sum_{j=1}^n \langle 1_{[t_{j-1},t_j]}, e_k \rangle s_j$ by linearity, your calculation shows
$$\Phi_{X^m}(s) = \exp \left(- \frac{1}{2} \sum_{k=1}^m \langle f_n, e_k \rangle^2 \right).$$
Letting $m \to \infty$, and using that $X^m \to X$ componentwise in $L^2$ (hence in probability, so the characteristic functions converge pointwise), we obtain
$$\Phi_X(s) = \exp \left(- \frac{1}{2} \sum_{k \geq 1} \langle f_n, e_k \rangle^2 \right). \tag{1}$$
Since $(e_k)_{k \geq 1}$ is an ONB, we have
$$\sum_{k \geq 1} \langle f_n, e_k \rangle^2 = \|f_n\|^2$$
by Parseval's identity. Combining
$$\|f_n\|^2 = \int_0^1 f_n(u)^2 \, du = \sum_{j=1}^n s_j^2 (t_j-t_{j-1})$$
with $(1)$ yields
$$\Phi_X(s) = \exp \left(- \frac{1}{2} \sum_{j=1}^n s_j^2 (t_j-t_{j-1}) \right)$$
for all $s \in \mathbb{R}^n$. This shows that $X=(W(1_{[t_0,t_1]}),\ldots,W(1_{[t_{n-1},t_n]}))$ is Gaussian with mean vector $0$ and covariance matrix $\text{diag}(t_1-t_0,\ldots,t_n-t_{n-1})$.
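If you want a quick numerical sanity check of the covariance claim, here is a minimal sketch. It assumes a concrete choice of ONB (the trigonometric system $e_1\equiv 1$, $\sqrt{2}\cos(2\pi kx)$, $\sqrt{2}\sin(2\pi kx)$) and i.i.d. standard normal $Y_i$; the partition, truncation level, and sample size are arbitrary choices for illustration, not part of the problem.

```python
import numpy as np

# Minimal sketch: approximate X^m = (W_m(1_[t_0,t_1]), ..., W_m(1_[t_{n-1},t_n]))
# with the trigonometric ONB of L^2[0,1] and i.i.d. N(0,1) coefficients Y_i,
# then compare the empirical covariance with diag(t_j - t_{j-1}).

rng = np.random.default_rng(0)
t = np.array([0.0, 0.2, 0.5, 0.9, 1.0])   # arbitrary partition t_0 < ... < t_n
n = len(t) - 1
K = 200                                    # number of cos/sin pairs, so m = 2K + 1
num_samples = 20_000

def inner_products(a, b, K):
    """<1_[a,b], e_k> for e_1 = 1, sqrt(2) cos(2 pi k x), sqrt(2) sin(2 pi k x)."""
    c = [b - a]                            # against the constant function e_1 = 1
    for k in range(1, K + 1):
        w = 2 * np.pi * k
        c.append(np.sqrt(2) * (np.sin(w * b) - np.sin(w * a)) / w)  # cosine coefficient
        c.append(np.sqrt(2) * (np.cos(w * a) - np.cos(w * b)) / w)  # sine coefficient
    return np.array(c)

# coeffs[j, i] = <1_[t_{j-1}, t_j], e_i>
coeffs = np.stack([inner_products(t[j], t[j + 1], K) for j in range(n)])
m = coeffs.shape[1]

Y = rng.standard_normal((num_samples, m))  # i.i.d. standard normal Y_1, ..., Y_m
X = Y @ coeffs.T                           # rows are independent samples of X^m

print("empirical covariance:\n", np.round(np.cov(X, rowvar=False), 3))
print("target diag(t_j - t_{j-1}):", np.diff(t))
```

With these settings the printed matrix should be close to $\operatorname{diag}(0.2, 0.3, 0.4, 0.1)$; the small discrepancies come from truncating the series at $m = 2K+1$ terms and from Monte Carlo error.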