Let $B_t$ be a standard Brownian motion and let $f$ be a continuous function on $[0,1]$. Set
$$Y = \int_0^1 f(s) B_s \, ds.$$
How can I show that $Y$ is normal? And what is its variance?
I know I can use the characteristic function to do this. But now I wonder about another way, using the fact that $$\left(\int_0^1 h(s) \, ds\right)^2 = \int_0^1 \int_0^1 h(s) h(t) \, ds \, dt$$ for bounded $h$. I try to simplify this by taking the expectation inside the integral. The result is $$\int_0^1 \int_0^1 f(s) f(t) \min(s,t) \, ds \, dt.$$ However, I cannot get any further.
For $n \in \mathbb{N}$ we set $t_j^n := t_j := \frac{j}{n}$. From the definition of the Riemann integral, we know that
$$Y_n := \frac{1}{n} \sum_{j=1}^n f(t_j) \cdot B(t_j) \stackrel{n \to \infty}{\to} \int_0^1 f(s) B(s) \,ds =:Y$$
almost surely. From the uniform continuity of $f$, it is therefore not difficult to see that
$$\tilde{Y}_n := \sum_{j=1}^n B(t_j) \cdot (\varphi(t_j)-\varphi(t_{j-1})) \stackrel{n \to \infty}{\to} Y$$
where
$$\varphi(t) := \int_0^t f(s) \, ds.$$
As $(B_t)_{t \geq 0}$ is a Brownian motion, hence a Gaussian process, the random vector $(B(t_1),\ldots,B(t_n))$ is Gaussian. This implies that $\tilde{Y}_n$ is Gaussian, since it is a linear combination of (jointly) Gaussian random variables. Because an almost sure limit of Gaussian random variables is again Gaussian (with the limiting mean and variance), it suffices to show that the sequences $\mathbb{E}\tilde{Y}_n$ and $\text{var} \, \tilde{Y}_n$ converge. To this end, we note that
$$\mathbb{E}\tilde{Y}_n = \sum_{j=1}^n \mathbb{E}(B_{t_j}) (\varphi(t_j)-\varphi(t_{j-1})) = 0$$
since $\mathbb{E}B_t = 0$ for all $t \geq 0$. To calculate the variance, we rewrite $\tilde{Y}_n$ using summation by parts (Abel summation), noting that $B_0 = 0$ and $\varphi(0) = 0$:
$$\tilde{Y}_n = -\sum_{j=1}^n \varphi(t_{j-1}) \cdot (B_{t_j}-B_{t_{j-1}})+\varphi(1) B_1=: S_n+\varphi(1) B_1.$$
This implies
$$\begin{align*} \text{var} \, \tilde{Y}_n &= \mathbb{E}(\tilde{Y}_n^2) = \mathbb{E}(S_n^2)+ 2\varphi(1) \mathbb{E}\big(B_1 \cdot S_n\big)+\varphi(1)^2 \mathbb{E}(B_1^2) \\ &= \underbrace{\sum_{j=1}^n \varphi(t_{j-1})^2 (t_j-t_{j-1})}_{\to \int_0^1 \varphi(s)^2 \, ds} -2 \varphi(1) \underbrace{\sum_{j=1}^n \varphi(t_{j-1})(t_j-t_{j-1})}_{\to \int_0^1 \varphi(s) \, ds} + \varphi(1)^2. \end{align*}$$
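To spell out the two underbraced identities: since the increments $B_{t_j}-B_{t_{j-1}}$ are independent with $\mathbb{E}\big((B_{t_j}-B_{t_{j-1}})^2\big) = t_j - t_{j-1}$, we get
$$\mathbb{E}(S_n^2) = \sum_{j=1}^n \varphi(t_{j-1})^2 \, \mathbb{E}\big((B_{t_j}-B_{t_{j-1}})^2\big) = \sum_{j=1}^n \varphi(t_{j-1})^2 (t_j-t_{j-1}),$$
and similarly, as $\mathbb{E}\big(B_1 (B_{t_j}-B_{t_{j-1}})\big) = t_j - t_{j-1}$,
$$\mathbb{E}(B_1 S_n) = -\sum_{j=1}^n \varphi(t_{j-1}) (t_j - t_{j-1}).$$
Both expressions are Riemann sums, which yields the stated limits.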
Finally, we conclude that $Y$ is a Gaussian random variable with mean $0$ and variance
$$\begin{align*} \sigma^2 &:= \int_0^1 \varphi(s)^2 \, ds - 2\varphi(1) \int_0^1 \varphi(s) \, ds + \varphi(1)^2 \\ &= \int_0^1 \left| \int_0^s f(r) \, dr \right|^2 \, ds - 2\varphi(1) \int_0^1 \int_0^s f(r) \, dr \, ds + \varphi(1)^2\end{align*}$$
where $\varphi(1) = \int_0^1 f(s) \, ds$.
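As a quick sanity check (with a choice of $f$ of my own), take $f \equiv 1$. Then $\varphi(t) = t$ and $\varphi(1) = 1$, so
$$\sigma^2 = \int_0^1 s^2 \, ds - 2 \int_0^1 s \, ds + 1 = \frac{1}{3} - 1 + 1 = \frac{1}{3},$$
which agrees with $\int_0^1 \int_0^1 \min\{s,t\} \, ds \, dt = \frac{1}{3}$.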
Remarks
The given integral is a special case of the so-called Paley-Wiener-Zygmund integral. These integrals are of the form $$\int_0^1 B(s) \, d\varphi(s)$$ where $\varphi$ is a function of bounded variation. Here, we have $$\varphi(t) = \int_0^t f(s) \, ds,$$ which is indeed a function of bounded variation.
Instead of calculating the variance using the random variables $\tilde{Y}_n$, we can also use $Y_n$ directly. Then we see that the variance equals $$\int_0^1 \int_0^1 f(s) f(t) \min\{s,t\} \, ds \, dt,$$ in agreement with the computation in the question.
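For the skeptical reader, here is a small Monte Carlo sketch (my own addition; the choice $f(t) = \sin(2\pi t)$ and all grid sizes are arbitrary) comparing the empirical variance of the Riemann sums $Y_n$ with the double-integral formula:

```python
import numpy as np

rng = np.random.default_rng(0)
n, paths = 1_000, 10_000
t = np.arange(1, n + 1) / n              # grid t_j = j/n
f = np.sin(2 * np.pi * t)                # example f; any continuous f works

# simulate Brownian paths on the grid as cumulative sums of increments
dB = rng.normal(0.0, np.sqrt(1.0 / n), size=(paths, n))
B = np.cumsum(dB, axis=1)

# Riemann-sum approximation Y_n = (1/n) * sum_j f(t_j) B(t_j)
Y = (f * B).sum(axis=1) / n

# closed-form variance: double Riemann sum of f(s) f(t) min(s, t)
sigma2 = (np.outer(f, f) * np.minimum.outer(t, t)).sum() / n**2

print(Y.var(), sigma2)                   # the two numbers should be close
```

For this particular $f$ one has $\varphi(t) = (1-\cos 2\pi t)/(2\pi)$ and $\varphi(1) = 0$, so the variance reduces to $\int_0^1 \varphi(s)^2 \, ds = 3/(8\pi^2)$, and both printed numbers should be near that value.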