I have a question whose suggested solution I don't really understand or agree with.
Question:
According to a survey on part-time workers (whose monthly salaries can vary from month to month), the average base salary for women is higher than the average base salary for men: \$2800/month for women versus \$2500/month for men. Assume monthly salaries are normally distributed with a standard deviation of \$300 for both men and women. Furthermore, assume salaries are independent across different months.
How much would a woman have to make in a year to have a higher salary than 99% of her male counterparts? (Please round your answer to 2 decimal places)
The suggested solution is to compute $$\operatorname{InvNorm}(0.99,\ 30000,\ 3600),$$
which gives \$38374.85 to 2 decimal places.
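
For reference, here is how I reproduced that value (a minimal sketch using `scipy.stats.norm.ppf`, which I believe performs the same computation as a calculator's `InvNorm`):

```python
from scipy.stats import norm

# 99th percentile of a normal distribution with
# mean 30000 and standard deviation 3600
cutoff = norm.ppf(0.99, loc=30000, scale=3600)
print(round(cutoff, 2))  # 38374.85
```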
I don't quite understand how this solution is derived. My guess is that the men's monthly normal distribution is scaled by 12 to convert to yearly units (written out below), meaning that

- 30000 is the men's average base salary per year, i.e. 2500 multiplied by 12,
- 3600 is the standard deviation, 300, multiplied by 12,

and we then look for the value at the 99th percentile of that distribution.
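
In other words, the annual salary $Y$ is treated as a single monthly salary $X \sim N(2500,\ 300^2)$ scaled by 12:
$$Y = 12X \quad\Longrightarrow\quad Y \sim N\left(12 \cdot 2500,\ (12 \cdot 300)^2\right) = N(30000,\ 3600^2).$$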
However, if that is the case, why isn't the standard deviation passed to InvNorm equal to $\sqrt{(12)(300^{2})}$? After all, an annual salary is the sum of 12 independent monthly salary random variables, not a single monthly salary multiplied by a scaling factor of 12.
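
Under that sum interpretation, with independent monthly salaries $X_1, \dots, X_{12}$, each $N(2500,\ 300^2)$,
$$Y = \sum_{i=1}^{12} X_i \sim N\left(12 \cdot 2500,\ 12 \cdot 300^2\right) = N(30000,\ 1039.23^2),$$
so the 99th-percentile cutoff would instead be roughly $\operatorname{InvNorm}(0.99,\ 30000,\ 1039.23) \approx 32417.61$.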
Let me know if more information is required. I would welcome an answer that gives a top-down explanation in case my understanding of the overall approach, rather than just the value of the standard deviation, is what's wrong.