Given a sample of observations $x_i$, $i=1,\ldots,n$, the Cramér-von Mises goodness-of-fit test statistic for testing the null hypothesis that the data follow a specified distribution $F_0(x)$ is:
\begin{equation}\label{e1}
\omega^2_n=n\int_{-\infty}^{+\infty} \left ( {\hat {F}_n(x)} - F_0 (x)\right )^2 \,dF_0(x),
\end{equation}
where $\hat {F}_n(\cdot)$ is a step function, the so-called empirical distribution function: at an arbitrary point $t\in (-\infty,+\infty)$, ${\hat {F}_n}(t)$ is defined as the proportion of sample observations less than or equal to $t$. In other words, if we denote the observed sample in increasing order by $x_{1:n}, x_{2:n}, \ldots, x_{n:n}$, then for arbitrary $t \in (-\infty,+\infty)$:
\begin{equation} {\hat {F}_n}(t) = \begin{cases} 0 & t < x_{1:n},\\ \frac{i}{n} & x_{i:n} \leq t < x_{(i+1):n}, \quad \text{for}\ i=1,\ldots, (n-1),\\ 1 & t \geq x_{n:n}.\\ \end{cases} \end{equation} In addition, $F_0 (x)$, the so-called theoretical distribution function, is a non-decreasing function with these properties:
\begin{align*} & 0 \leq F_0 (x) \leq 1, \quad \forall x \in (-\infty,+\infty),\\ &\lim_{x\to-\infty} F_0 (x) = 0, \qquad \lim_{x\to+\infty} F_0 (x) = 1. \end{align*}
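As a concrete illustration of the empirical distribution function above, here is a minimal Python sketch (the helper name `ecdf` is my own, not from the original text); it evaluates $\hat F_n(t)$ by counting sorted observations up to $t$:

```python
import numpy as np

def ecdf(sample):
    """Return the empirical distribution function of a sample."""
    xs = np.sort(np.asarray(sample, dtype=float))
    n = len(xs)

    def F_hat(t):
        # proportion of observations less than or equal to t
        return np.searchsorted(xs, t, side="right") / n

    return F_hat
```

For example, with the sample $\{0.2, 0.5, 0.9\}$, `ecdf([0.2, 0.5, 0.9])(0.5)` returns $2/3$, matching the piecewise definition.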
In the statistical literature, the quantity $\omega_n ^{2}$ is computed as follows:
\begin{equation}\label{e3}
\hat{\omega}_n ^{2}=\frac {1}{12n}+\sum _{i=1}^{n}\left[{\frac {2i-1}{2n}}- F_0(x_{i:n})\right]^{2}.
\end{equation}
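The computational formula above translates directly into code. A short Python sketch (assuming `F0` is a callable that evaluates the hypothesized CDF and accepts a numpy array):

```python
import numpy as np

def cramer_von_mises(sample, F0):
    """Cramér-von Mises statistic via the computational formula
    omega^2 = 1/(12n) + sum_i [(2i-1)/(2n) - F0(x_{i:n})]^2."""
    xs = np.sort(np.asarray(sample, dtype=float))
    n = len(xs)
    i = np.arange(1, n + 1)
    return 1.0 / (12 * n) + np.sum(((2 * i - 1) / (2 * n) - F0(xs)) ** 2)
```

For instance, for the sample $\{0.25, 0.5, 0.75\}$ tested against the uniform CDF $F_0(x)=x$, the sum contributes $1/144 + 0 + 1/144$ and the statistic equals $1/36 + 1/72 = 1/24$.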
On the other hand, some literature \cite{Cramer1928} uses the statistic known as the $L_2$-distance,
\begin{equation}\label{e4}
\int_{-\infty}^{+\infty} \left ( {\hat {F}_n(x)} - F_0(x)\right)^2 \,dx,
\end{equation}
as a goodness-of-fit criterion for testing the above-mentioned null hypothesis; it measures the discrepancy between the empirical and theoretical distribution functions.
My question is how to calculate (probably with a program) the $L_2$-distance defined above.
To be specific, consider the following sample of size $n=30$, with mean ${\bar x}=0.591$. Let $F_0(x)$ be the exponential distribution function: \begin{equation}\label{e5} {F}_0 (x) = \begin{cases} 0 & x<0,\\ 1- \exp(-\lambda x) & x \geq 0,\\ \end{cases} \end{equation} where $\lambda =\frac{1}{{\bar x}}=\frac{n}{\sum_{i=1}^{n} x_i}=1.69$.
The figure shows the empirical and theoretical distribution functions, ${\hat {F}_n}(\cdot)$ and ${F}_0 (\cdot)$, for which we want to calculate the $L_2$-distance. The sample (R output):
[1] 0.221 0.264 0.404 0.009 0.228 0.012 3.201 0.629 0.481 0.909
[11] 0.113 0.223 0.626 0.199 0.044 0.861 2.692 0.944 0.182 0.587
[21] 0.570 0.214 0.729 0.282 0.615 0.366 0.489 1.401 0.126 0.109
(If I can find the answer for this special case, I can then generalize it to other distributions such as the lognormal, Pareto, and even more complicated distributions.)
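One way to attack the exponential case: since $\hat F_n$ is constant on each interval between order statistics, the integrand $(c - F_0(x))^2 = (c-1)^2 + 2(c-1)e^{-\lambda x} + e^{-2\lambda x}$ has an elementary antiderivative on each piece, so the $L_2$-distance can be computed exactly. A Python sketch under that approach (the function name `l2_distance_exp` is my own; for other distributions, such as the lognormal or Pareto, the per-interval integrals could instead be evaluated numerically, e.g. with `scipy.integrate.quad`):

```python
import numpy as np

def l2_distance_exp(sample, lam):
    """Exact L2 distance between the ECDF of `sample` and the
    exponential CDF F0(x) = 1 - exp(-lam*x), integrated piece by piece."""
    xs = np.sort(np.asarray(sample, dtype=float))
    n = len(xs)
    breaks = np.concatenate(([0.0], xs))  # ECDF = i/n on [x_{i:n}, x_{(i+1):n})
    total = 0.0
    for i in range(n):
        c, lo, hi = i / n, breaks[i], breaks[i + 1]
        # integral of (c-1)^2 + 2(c-1)e^{-lam x} + e^{-2 lam x} over [lo, hi]
        total += ((c - 1) ** 2 * (hi - lo)
                  + 2 * (c - 1) * (np.exp(-lam * lo) - np.exp(-lam * hi)) / lam
                  + (np.exp(-2 * lam * lo) - np.exp(-2 * lam * hi)) / (2 * lam))
    # tail beyond the largest observation: integrand = (1 - F0(x))^2 = e^{-2 lam x}
    total += np.exp(-2 * lam * xs[-1]) / (2 * lam)
    return total

data = [0.221, 0.264, 0.404, 0.009, 0.228, 0.012, 3.201, 0.629, 0.481, 0.909,
        0.113, 0.223, 0.626, 0.199, 0.044, 0.861, 2.692, 0.944, 0.182, 0.587,
        0.570, 0.214, 0.729, 0.282, 0.615, 0.366, 0.489, 1.401, 0.126, 0.109]
lam = 1.0 / np.mean(data)  # maximum-likelihood estimate, approximately 1.69
print(l2_distance_exp(data, lam))
```

No contribution arises from $x<0$, where both $\hat F_n$ and $F_0$ are zero; the tail term handles $x > x_{n:n}$, where $\hat F_n \equiv 1$.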