I've used `Math.random()*2-1 + Math.random()*2-1 + Math.random()*2-1` many times in the past to get approximately normally distributed random numbers with a standard deviation of 1. Of course, it's a bit of an approximation, but it works, and I usually don't want numbers outside the third standard deviation anyway. I sort of understand intuitively why this works (numbers further from zero are much less likely because there are fewer ways to produce them), but I have no idea how to prove it rigorously.
Why does adding three random numbers in the range $[-1,1]$ specifically yield a standard deviation of 1? I did some tests and found that adding four gives a std. dev. of about 1.155, and adding two gives around 0.813. Why is this?
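For reference, my tests were along these lines (a rough sketch; `sampleStdDev` is just a name I made up):

```javascript
// Estimate the standard deviation of a sum of n draws from [-1, 1].
function sampleStdDev(n, trials) {
  let sum = 0;
  let sumSq = 0;
  for (let t = 0; t < trials; t++) {
    let y = 0;
    for (let i = 0; i < n; i++) {
      y += Math.random() * 2 - 1; // one uniform draw on [-1, 1]
    }
    sum += y;
    sumSq += y * y;
  }
  const mean = sum / trials;
  return Math.sqrt(sumSq / trials - mean * mean);
}

console.log(sampleStdDev(2, 1e6)); // ~0.816
console.log(sampleStdDev(3, 1e6)); // ~1.000
console.log(sampleStdDev(4, 1e6)); // ~1.155
```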
I want to understand the real math behind this.
This is a combination of the central limit theorem (as @AlfonsoFernandez points out) and the fact that the sum of three such uniform random variables has a mean of 0 and a variance of 1 (shown below).
Let $X_i\sim U(0,1)$; then $1-2X_i\sim U(-1,1)$.
Let $Y=(1-2X_1)+(1-2X_2)+(1-2X_3)$.
By the CLT, as we add together independent $X_i$'s, the distribution of the sum approaches a normal. Even with only three terms, $Y$ is already approximately normal. Further:
$E(X_i)=\frac{1}{2}$; $Var(X_i)=\frac{1}{12}$
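Both of these follow from direct integration over the unit interval:

$$E(X_i)=\int_0^1 x\,dx=\frac{1}{2},\qquad Var(X_i)=E(X_i^2)-E(X_i)^2=\int_0^1 x^2\,dx-\left(\frac{1}{2}\right)^2=\frac{1}{3}-\frac{1}{4}=\frac{1}{12}.$$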
$E(1-2X_i)=0$; $Var(1-2X_i)=\frac{1}{3}$
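The jump from $\frac{1}{12}$ to $\frac{1}{3}$ uses the scaling rule for variance, $Var(a+bX)=b^2\,Var(X)$:

$$Var(1-2X_i)=(-2)^2\,Var(X_i)=4\cdot\frac{1}{12}=\frac{1}{3}.$$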
Now we have:
$E(Y)=E((1-2X_1)+(1-2X_2)+(1-2X_3))=0$
And, since the variables are independent:
$Var(Y)=Var(1-2X_1+1-2X_2+1-2X_3)$
$=Var(1-2X_1)+Var(1-2X_2)+Var(1-2X_3)=1$
Thus $E(Y)=0$, $Var(Y)=1$, and $Y$ is approximately normal.
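Putting it together, a generator and a quick empirical check might look like this (a sketch; `approxNormal` is just an illustrative name). Keep in mind the approximation has hard cutoffs at $\pm 3$ and slightly lighter tails than a true Gaussian; for instance, the mass within one standard deviation is exactly $2/3$ for this distribution versus about $0.683$ for a true normal:

```javascript
// Sum of three uniforms on [-1, 1]: mean 0, variance 1, approximately normal.
// (approxNormal is an illustrative name, not from the original post.)
function approxNormal() {
  return (Math.random() * 2 - 1)
       + (Math.random() * 2 - 1)
       + (Math.random() * 2 - 1);
}

// Empirical check of mean, variance, and mass within one standard deviation.
const N = 1e6;
let sum = 0, sumSq = 0, within1 = 0;
for (let i = 0; i < N; i++) {
  const y = approxNormal();
  sum += y;
  sumSq += y * y;
  if (Math.abs(y) < 1) within1++;
}
console.log("mean:", sum / N);            // close to 0
console.log("variance:", sumSq / N);      // close to 1
console.log("P(|Y| < 1):", within1 / N);  // close to 2/3; a true normal gives ~0.683
```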