Say I roll a six-sided die 100,000 times.
I know that the resulting sums will be approximately normally distributed around a mean of 3.5 per roll (so 350,000 for the sum itself).
How would one go about calculating the standard deviation of the distribution of those sums?
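For reference, here is a minimal simulation sketch of the experiment (assuming NumPy is available; the seed and `n_trials` are arbitrary choices) that estimates the standard deviation of the sum empirically:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_rolls = 100_000   # die rolls per experiment
n_trials = 1_000    # repeated experiments (an arbitrary choice)

# Sum of 100,000 fair-die rolls, repeated n_trials times.
sums = np.array([rng.integers(1, 7, size=n_rolls).sum()
                 for _ in range(n_trials)])

print("mean of sums:", sums.mean())  # expect about 100000 * 3.5 = 350000
print("std of sums: ", sums.std())   # expect about 540
```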
The mean of a single roll is $(1+2+3+4+5+6)/6=7/2$ and the mean square is $(1+4+9+16+25+36)/6=91/6$, so the variance of a single roll is $91/6-(7/2)^2=35/12$. The variance of a sum of independent random variables is the sum of their variances, so the variance of the sum of $100,000$ rolls is $100000\cdot35/12\approx291666.67$, and the standard deviation is the square root of that, about $540.06$.
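The same computation in code (a short pure-Python sketch; the variable names are just illustrative):

```python
from math import sqrt

faces = range(1, 7)
mean = sum(faces) / 6                        # 7/2 = 3.5
mean_square = sum(f * f for f in faces) / 6  # 91/6
var_one_roll = mean_square - mean ** 2       # 35/12

n = 100_000
var_sum = n * var_one_roll   # variances of independent rolls add
print(sqrt(var_sum))         # ≈ 540.06
```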