Calculating the standard deviation of the sum of a series of randomly distributed variables


Say I roll a six-sided die 100,000 times.

I know that the resulting sum will be approximately normally distributed; each individual roll has a mean of 3.5, so the sum has a mean of 350,000.

How would one go about calculating the standard deviation of the distribution of those sums?


The mean of a single roll is $(1+2+\cdots+6)/6=7/2$ and the mean square is $(1+4+9+16+25+36)/6=91/6$, so the variance of a single roll is $91/6-(7/2)^2=35/12$. The variance of a sum of independent random variables is the sum of their variances, so the variance of the sum of $100{,}000$ rolls is $100000\cdot35/12$, and the standard deviation is the square root of that: $\sqrt{100000\cdot35/12}\approx540.06$.
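The calculation above can be checked numerically. The following sketch computes the exact variance and standard deviation from the formula, then runs a small Monte Carlo check (the number of repeated experiments, 40, is an arbitrary choice for illustration):

```python
import math
import random

# Exact single-roll moments for a fair six-sided die:
# E[X^2] = (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6, E[X] = 7/2
mean_square = sum(k * k for k in range(1, 7)) / 6
var_single = mean_square - (7 / 2) ** 2          # 35/12 ≈ 2.9167

n = 100_000
sd_sum = math.sqrt(n * var_single)               # sqrt(100000 * 35/12) ≈ 540.06
print(f"standard deviation of the sum: {sd_sum:.2f}")

# Monte Carlo check: repeat the 100,000-roll experiment a few times
# and compare the sample standard deviation of the sums.
random.seed(0)
trials = 40  # assumption: 40 repetitions, enough for a rough check
sums = [sum(random.randint(1, 6) for _ in range(n)) for _ in range(trials)]
mean_of_sums = sum(sums) / trials
sample_sd = math.sqrt(
    sum((s - mean_of_sums) ** 2 for s in sums) / (trials - 1)
)
print(f"empirical standard deviation over {trials} trials: {sample_sd:.1f}")
```

With only 40 repetitions the empirical value fluctuates by roughly 10% around the theoretical 540.06, which is consistent with the formula.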