I have a function f(x).
I do M runs, letting M tend to infinity. For each run i, I generate two independent random samples x_i and y_i from the domain of f and calculate the deviation f(x_i) - f(y_i) and its square.
What I have found is that the sum (or, in the limit, the integral) over these independent, hence uncorrelated, deviations adds up to 0. An explanation of why this happens would be really helpful.
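For concreteness, here is a minimal sketch of the setup in Python. The choice f(x) = x^2 and the uniform sampling on [0, 1] are placeholder assumptions, not part of the question; any f and any sampling distribution on its domain should show the same behavior.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Placeholder function; x**2 is an arbitrary assumed choice.
    return x**2

M = 1_000_000  # number of runs; the question lets M tend to infinity

# Two independent samples from the domain of f per run,
# assumed here to be drawn uniformly from [0, 1].
x = rng.uniform(0.0, 1.0, size=M)
y = rng.uniform(0.0, 1.0, size=M)

dev = f(x) - f(y)   # signed deviation for each run
sq_dev = dev ** 2   # squared deviation for each run

print("mean deviation:        ", dev.mean())     # approaches 0 as M grows
print("mean squared deviation:", sq_dev.mean())  # stays strictly positive
```

In this sketch the average of the signed deviations comes out near 0 while the average of their squares does not, which is the cancellation I am asking about.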
Thanks