Let $X(t)$ be a stationary Gaussian process with mean $\mu$, variance $\sigma^2$, and stationary correlation function $\rho(t_1-t_2)$. If $X(t)$ is a white noise process, the correlation function is given by the Dirac delta function $\rho(t_1-t_2) = \delta(t_1-t_2)$.
The integral of this process is given by:
$$I = \int_0^L X(t) \, dt$$
According to this CrossValidated post the variance of $I$ is given by:
$$\text{Var}[I] = L\sigma^2$$
However, this does not agree with the results I obtain through simulation. My approach is to discretise the white noise Gaussian process into $N$ independent normal variables, so that the integral can be approximated by:
$$ I = \int_0^L X(t)\,dt \approx \frac{L}{N}\sum_{i=1}^N X_i$$
where the $X_i$ are independent random variables, $X_i \sim \mathcal{N}(\mu,\sigma^2)$. In simulation I find that as $N$ grows large, $\text{Var}[I] \rightarrow 0$. Why does it not approach $L\sigma^2$? What is the problem with my approximation?
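To make the discrepancy concrete, here is a minimal sketch (in NumPy, with arbitrary example values for $L$, $\mu$, $\sigma$) of the naive discretization described above. Its variance is $(L/N)^2 \cdot N\sigma^2 = L^2\sigma^2/N$, which shrinks as $N$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
L, mu, sigma = 2.0, 0.0, 1.0  # example values, not from the original post

def naive_integral_variance(N, n_trials=20000):
    # Naive discretization: X_i ~ N(mu, sigma^2), I ≈ (L/N) * sum X_i
    X = rng.normal(mu, sigma, size=(n_trials, N))
    I = (L / N) * X.sum(axis=1)
    return I.var()

# Empirical variance tracks L^2 * sigma^2 / N, which vanishes as N grows
for N in (10, 100, 1000):
    print(N, naive_integral_variance(N), L**2 * sigma**2 / N)
```

This reproduces the observed behaviour: the empirical variance of $I$ decays like $1/N$ rather than converging to $L\sigma^2$.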
The discrepancy arises because your discretization of the continuous process does not assign the appropriate variance to the $X_i$. Here's the key (for heuristics, see here, Section 3.2):
(Heuristically: the power spectral density of the i.i.d. sequence approaches that of the continuous process it simulates, flat and equal to the same constant value $\sigma^2$, except that for the i.i.d. sequence it is "band-limited", i.e. it vanishes outside a finite-width interval.)
Thus, in the present case, to simulate $$I = \int_0^L X(t) \, dt \approx \sum_{i=1}^N X(i\Delta)\,\Delta $$ where $\Delta=\frac{L}{N}$, one would use $$\hat{I} = \Delta\sum_{i=1}^N X_i $$ with $X_1,X_2,\ldots$ i.i.d. $\text{Gaussian}(\text{mean}=\mu,\text{variance}=\frac{\sigma^2}{\Delta})$. Then $$\begin{align} \text{Var}[\hat{I}] = \Delta^2\,N\,\frac{\sigma^2}{\Delta} = (N\Delta)\sigma^2 = L\sigma^2. \end{align} $$
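The corrected discretization above can be checked numerically. A minimal sketch (again with arbitrary example values for $L$, $\mu$, $\sigma$), where each $X_i$ now has variance $\sigma^2/\Delta$:

```python
import numpy as np

rng = np.random.default_rng(0)
L, mu, sigma = 2.0, 0.0, 1.0  # example values, not from the original post

def white_noise_integral_variance(N, n_trials=20000):
    delta = L / N
    # Corrected discretization: X_i ~ N(mu, sigma^2 / delta),
    # so the standard deviation is sigma / sqrt(delta)
    X = rng.normal(mu, sigma / np.sqrt(delta), size=(n_trials, N))
    I_hat = delta * X.sum(axis=1)
    return I_hat.var()

# Var[I_hat] stays near L * sigma^2 regardless of N
for N in (10, 100, 1000):
    print(N, white_noise_integral_variance(N))
```

With the $1/\Delta$ variance scaling, the empirical variance of $\hat{I}$ stays near $L\sigma^2$ for every $N$, matching the derivation above.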