How to prove that the norm of the sum of two random high-dimensional unit vectors is close to $\sqrt2$?


We've noticed a detail in our program that we don't immediately know how to prove.

Take two 'random' vectors $x$ and $y$ of large dimension $d$. Each vector is created by generating $d$ independent random numbers uniformly distributed between $-1/2$ and $1/2$, and then normalizing the vector to unit length.

We've noticed that $\| x + y \| \approx \sqrt2 $. Can we prove this easily?
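For reference, here is a minimal NumPy sketch (assuming the setup described above: i.i.d. uniform entries on $[-1/2, 1/2]$, then normalization) that reproduces the observation. The key identity is $\|x+y\|^2 = \|x\|^2 + \|y\|^2 + 2\,x\cdot y = 2 + 2\,x\cdot y$, and in high dimension the dot product of two independent random unit vectors concentrates near $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000  # large dimension

# Generate two vectors with i.i.d. entries uniform on [-1/2, 1/2].
x = rng.uniform(-0.5, 0.5, d)
y = rng.uniform(-0.5, 0.5, d)

# Normalize each to unit length.
x /= np.linalg.norm(x)
y /= np.linalg.norm(y)

# ||x + y||^2 = 2 + 2 (x . y); the dot product is O(1/sqrt(d)),
# so the norm is close to sqrt(2) for large d.
print(np.linalg.norm(x + y))  # close to sqrt(2) ~ 1.4142
print(abs(x @ y))             # small, on the order of 1/sqrt(d)
```

So the question reduces to showing that $x \cdot y \to 0$ (in probability) as $d \to \infty$ for vectors generated this way.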