EDIT: To formulate this in a mathematical framework:

I have a sampling generator producing i.i.d. Gaussian samples. To quantify the convergence in distribution, I compute the following error.

Given a number of bins $m$, a bandwidth $H$, and a theoretical density $D$, the sample $(x_k)_{k=1,\dots,n}$ has the following distribution error:
$$E(n,m)=\left|I_{n,m} - J_m\right|$$
$$I_{n,m}=\frac{1}{m} \sum_{i=1}^m f(c_i)\, PI(u_i,u_{i+1},n)$$
$$PI(u_i,u_{i+1},n)= \sum_{k=1}^n \mathbf{1}_{x_k\in[u_i,u_{i+1})}$$
$$c_i =\frac{u_i+u_{i+1}}{2}, \qquad u_i = \frac{iH}{m}$$
$$J_m=\frac{1}{m} \sum_{i=1}^m f(c_i)\, D(c_i)\, \Delta u_i, \qquad \Delta u_i = u_{i+1}-u_i=\frac{H}{m}$$
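A minimal numerical sketch of $E(n,m)$, under two assumptions not stated in the formulas above: the count $PI$ is divided by $n$ so that $I_{n,m}$ and $J_m$ are on the same scale (since $PI/n \approx D(c_i)\,\Delta u_i$ for large $n$), and the samples are taken as $|N(0,1)|$ with $f \equiv 1$ so that everything lives on $[0,H]$. The function names (`binned_error`) and parameter choices are illustrative, not canonical:

```python
import numpy as np

def binned_error(x, f, D, H, m):
    """E(n, m) = |I_{n,m} - J_m| for a sample x, test function f,
    theoretical density D, bandwidth H and m bins on [0, H].

    Assumption: PI is divided by n so that I_{n,m} and J_m are
    comparable (PI/n ~ D(c_i) * Δu_i for large n); this normalization
    is not part of the formulas as originally written.
    """
    n = len(x)
    u = np.arange(m + 1) * H / m           # u_i = i * H / m
    c = 0.5 * (u[:-1] + u[1:])             # bin midpoints c_i
    PI, _ = np.histogram(x, bins=u)        # counts of x_k in [u_i, u_{i+1})
    du = H / m                             # Δu_i = H / m
    I = np.sum(f(c) * PI / n) / m
    J = np.sum(f(c) * D(c) * du) / m
    return abs(I - J)

rng = np.random.default_rng(0)
x = np.abs(rng.standard_normal(100_000))   # |N(0,1)|, supported on [0, ∞)
D = lambda t: 2.0 * np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)  # half-normal density
f = lambda t: np.ones_like(t)              # simplest test function f ≡ 1
err = binned_error(x, f, D, H=4.0, m=50)
print(err)
```

With this normalization the error is small and shrinks as $n$ grows, which is the behavior the definition is meant to capture.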
Does this error make sense? Since the Central Limit Theorem only gives an asymptotic error, I would like to compute a more precise (finite-sample) error.

After some research, this question relates to kernel density smoothing and estimation from the empirical distribution:
https://en.wikipedia.org/wiki/Kernel_density_estimation
The KDE should converge to the theoretical density as the number of samples increases...
The rate of this convergence is related to the structure of the random generator.
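This convergence can be checked empirically. A hedged sketch using `scipy.stats.gaussian_kde` (one standard KDE implementation, with a Gaussian kernel and Scott's-rule bandwidth; the grid and sample sizes below are arbitrary choices):

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(1)
grid = np.linspace(-3, 3, 201)
true_density = norm.pdf(grid)              # target: standard normal pdf

# Max absolute deviation between the KDE and the true N(0,1) density;
# it typically shrinks as the sample size grows.
errors = {}
for n in (100, 10_000):
    sample = rng.standard_normal(n)
    kde = gaussian_kde(sample)             # Gaussian kernel, Scott's-rule bandwidth
    errors[n] = np.max(np.abs(kde(grid) - true_density))

print(errors)
```

For a fixed kernel, the bandwidth must also shrink with $n$ (as automatic rules like Scott's do) for the KDE to be consistent.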
Another point is the consistency of the kernel density estimator in the asymptotic regime, when the sample size becomes infinite.
This is discussed here:
Kernel density estimation in the limit of infinitely many samples
Additionally, this question is related to estimating the sampling error incurred when sampling from the distribution.
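One non-asymptotic handle on that sampling error (as opposed to the CLT's asymptotic one) is the Dvoretzky–Kiefer–Wolfowitz inequality, which bounds $\sup_t |F_n(t) - F(t)|$ with finite-sample, distribution-free constants. A sketch comparing the observed sup-distance to the DKW bound; the sample size, seed, and confidence level are arbitrary choices:

```python
import numpy as np
from scipy.stats import norm

def dkw_bound(n, alpha):
    """DKW inequality: with probability >= 1 - alpha,
    sup_t |F_n(t) - F(t)| <= sqrt(log(2/alpha) / (2n))."""
    return np.sqrt(np.log(2.0 / alpha) / (2.0 * n))

rng = np.random.default_rng(2)
n, alpha = 5_000, 0.05
x = np.sort(rng.standard_normal(n))
F = norm.cdf(x)                            # true CDF at the order statistics
F_n = np.arange(1, n + 1) / n              # empirical CDF just after each x_(i)
# sup-distance between F_n and F (Kolmogorov-Smirnov statistic)
ks_stat = np.max(np.maximum(np.abs(F_n - F), np.abs(F_n - 1.0 / n - F)))
print(ks_stat, dkw_bound(n, alpha))
```

Unlike the CLT, this bound holds for every finite $n$, which is closer to the "more precise error" asked for above (though it is stated for the CDF, not the binned density error $E(n,m)$ directly).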