This isn't so much a question about getting a right answer as much as it's about understanding a mathematical concept, but I will give you the problem that spawned it:
An analysis of data shows that the annual income of a randomly chosen individual from country A has mean \$18000 and standard deviation \$6000, and the annual income of a randomly chosen individual from country B has mean \$31000 and standard deviation \$8000. Suppose a random sample of 100 individuals is taken from each country. Find the approximate probability that the average income of the sample from B is at least \$15000 larger than that of the sample from A.
I know how to get the answer to this question, so never mind that. My question is about the variance computed for the normalizations of A and B.
For instance, why is the variance for A calculated as $(1/100)^2$ times the sum of the 100 variances? Likewise for B. The $(1/100)^2$ factor is throwing me off. Why are we multiplying by $1/100^2$?
If $X$ is a random variable, the variance of $100X$ is $100^2$ times the variance of $X$. Likewise the variance of $X/100$ is $1/100^2$ times the variance of $X$.
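You can check this scaling rule numerically. The sketch below (using Python's standard-library `random` and `statistics` modules; the sample size and seed are arbitrary choices, not from the problem) draws incomes with country A's mean and standard deviation and compares the variance of the scaled data to the scaled variance:

```python
import random
import statistics

random.seed(0)
# Simulated incomes with country A's mean $18000 and SD $6000
x = [random.gauss(18000, 6000) for _ in range(10_000)]

var_x = statistics.pvariance(x)
var_100x = statistics.pvariance([100 * v for v in x])   # Var(100 X)
var_x_over_100 = statistics.pvariance([v / 100 for v in x])  # Var(X/100)

# Var(100 X) = 100^2 Var(X) and Var(X/100) = Var(X) / 100^2,
# up to floating-point rounding
print(var_100x / var_x)        # approximately 100**2
print(var_x_over_100 / var_x)  # approximately 1 / 100**2
```

The ratios come out as $100^2$ and $1/100^2$ regardless of the distribution sampled, because the identity $\operatorname{Var}(cX) = c^2\operatorname{Var}(X)$ holds for any random variable with finite variance.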
I don't know what that has to do with your problem, but that is a common way for the square of a number to show up as a factor when computing a variance.
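In your problem the factor arises as follows (assuming, as your question suggests, independent samples of size $n = 100$ from each country). The sample average is $\bar{X} = \frac{1}{100}\sum_{i=1}^{100} X_i$, so

$$\operatorname{Var}(\bar{X}) = \operatorname{Var}\!\left(\frac{1}{100}\sum_{i=1}^{100} X_i\right) = \left(\frac{1}{100}\right)^2 \sum_{i=1}^{100} \operatorname{Var}(X_i) = \frac{1}{100^2}\cdot 100\,\sigma^2 = \frac{\sigma^2}{100}.$$

For country A this gives $\operatorname{Var}(\bar{X}_A) = 6000^2/100 = 360000$ (standard deviation \$600), and for country B, $8000^2/100 = 640000$ (standard deviation \$800). The $(1/100)^2$ comes from pulling the constant $1/100$ out of the variance, and the sum of the 100 variances uses the independence of the individual incomes.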