Variance of Sum of 100 dice rolls?


If I roll 100 dice, I would expect the distribution of the sum to approach a normal distribution, right? Now, how can I calculate the variance and standard deviation of the distribution of the sum of 100 dice rolls? Here's what I'm thinking:

E[1 die roll] = 3.5 // Variance[1 die roll] = 35/12 ≈ 2.92

Variance[100 dice rolls] = 100 * Variance[1 die roll] = 3500/12 ≈ 291.67

Standard deviation[100 dice rolls] = sqrt(291.67) ≈ 17.08
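As a sanity check on these numbers, here is a minimal Monte Carlo sketch in NumPy (the trial count and seed are arbitrary choices) that simulates summing 100 fair six-sided dice many times and reports the sample mean, variance, and standard deviation:

```python
import numpy as np

rng = np.random.default_rng(0)

# 200,000 trials, each summing 100 fair six-sided dice
sums = rng.integers(1, 7, size=(200_000, 100)).sum(axis=1)

print(sums.mean())  # should be close to 100 * 3.5 = 350
print(sums.var())   # should be close to 100 * 35/12 ≈ 291.67
print(sums.std())   # should be close to sqrt(291.67) ≈ 17.08
```

A histogram of `sums` would also look visibly bell-shaped, consistent with the central limit theorem intuition above.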

Is this correct? And if so, why is it that in this specific case I can simply add the variances?