I'm experimenting with Gaussian blur and was a bit surprised by the following:
Let $A\in\mathbf{R}^{m\times n}$ be a matrix whose entries are independent and normally distributed. Then create new values by taking the average $\frac{1}{m}\sum_iA_{i,j(i)}$, where $j$ is a random string with $0\le j(i)<n$ (i.e. each row contributes one randomly chosen entry).
I expected those new points to be approximately normally distributed for large $m$, and indeed each cluster looks normal up to an affine transformation. But there are very visible shifts between multiple runs (here $m=100$, $n=5$; each cluster corresponds to a fixed $A$, with only the random string varying between its points):

So it seems that the variance of the mean of $k$ such points (for a fixed $A$) is much larger than the variance of the mean of $k$ independent normally distributed points. Any ideas how I could generate $A$ so as to keep that variance as small as possible?
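For concreteness, the experiment described above can be reproduced with a short simulation (a sketch assuming numpy; the number of clusters and of random strings per cluster are arbitrary choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 5      # matrix shape from the question
clusters = 3       # number of fixed matrices A, one per cluster
samples = 1000     # number of random strings j drawn per fixed A

for run in range(clusters):
    A = rng.standard_normal((m, n))            # one fixed A per cluster
    j = rng.integers(0, n, size=(samples, m))  # random strings j(i)
    # Fancy indexing picks A[i, j(i)] for each row i and each sample.
    means = A[np.arange(m), j].mean(axis=1)    # (1/m) * sum_i A[i, j(i)]
    print(f"A #{run}: cluster center = {means.mean():+.4f}, "
          f"spread within cluster = {means.std():.4f}")
```

The cluster centers shift between runs because they track the grand mean of the particular realization of $A$, which itself fluctuates on the order of $1/\sqrt{mn}$.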
Let each element $A_{i,j}$ of $A$ have a $\mathcal{N}(\mu_{i,j}, \sigma^2_{i,j})$ distribution (normal with mean $\mu_{i,j}$ and variance $\sigma^2_{i,j}$). Since the entries are independent, for a fixed string $j$ the sum $\sum_{i} A_{i, j(i)}$ has a normal distribution with mean $\sum_{i} \mu_{i, j(i)}$ and variance $\sum_{i} \sigma^2_{i, j(i)}$, and hence $\frac{1}{m}\sum_{i} A_{i, j(i)}$ has a normal distribution with mean $\frac{1}{m} \sum_{i} \mu_{i, j(i)}$ and variance $\frac{1}{m^2}\sum_{i} \sigma^2_{i, j(i)}$.
So all your samples do look normal, and these formulas explain why different choices of $j$ (and different realizations of $A$) give them different means and variances.
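The mean and variance formulas above can be verified with a quick Monte Carlo check (a sketch assuming numpy; the per-entry means, variances, and the fixed string $j$ below are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 3
mu = rng.uniform(-1.0, 1.0, size=(m, n))    # mu_{i,j}, arbitrary
sigma = rng.uniform(0.5, 2.0, size=(m, n))  # sigma_{i,j}, arbitrary

j = np.array([0, 2, 1, 0])                  # one fixed string j
trials = 200_000
rows = np.arange(m)
# Draw fresh A entries each trial, only at the selected columns j(i).
A_sel = rng.normal(mu[rows, j], sigma[rows, j], size=(trials, m))
vals = A_sel.mean(axis=1)                   # (1/m) * sum_i A_{i, j(i)}

print("empirical mean :", vals.mean())
print("predicted mean :", mu[rows, j].mean())
print("empirical var  :", vals.var())
print("predicted var  :", (sigma[rows, j] ** 2).sum() / m**2)
```

The empirical mean and variance match $\frac{1}{m}\sum_i \mu_{i,j(i)}$ and $\frac{1}{m^2}\sum_i \sigma^2_{i,j(i)}$ to within Monte Carlo error.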