I'm a math amateur. I was just given a raise at my job and my salary grew from $\$1,200$ a month to $\$1,300$ a month.
I'm trying to figure out percentage growth in my salary.
Did it grow by $\frac{\$100}{\$1,200} \approx 8.3\%$ or by $\frac{\$100}{\$1,300} \approx 7.7\%$?

What's the general rule for calculating percent growth when an old value changes to a new value?

I just can't figure it out (I've been thinking about this for 30 minutes).
The increase is $\frac{100}{1200} \approx 8.3\%$. If your salary is then decreased back to $\$1200/\text{mo}$, that is a decrease of $\frac{100}{1300} \approx 7.7\%$. The fact that these two percentages are not the same is not strange: the raise is measured relative to the old salary, while the cut is measured relative to the new one. If the raise and decrease are small, the percentages are close; it gets clearer when the numbers are large. Suppose your salary were doubled. That would be a $100\%$ increase. Then if it were reduced to the previous value, that would be a $50\%$ decrease.

The general rule: you always divide by the old number, so percent growth is $\frac{\text{new} - \text{old}}{\text{old}} \times 100\%$.
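A minimal sketch of the "divide by the old number" rule, using your salary figures (the function name `percent_change` is just illustrative):

```python
def percent_change(old, new):
    """Percent change from old to new, measured relative to the old value."""
    return (new - old) / old * 100

# Raise from $1200 to $1300: divide the $100 difference by the OLD value, 1200.
print(round(percent_change(1200, 1300), 2))  # 8.33 (an 8.33% increase)

# Cut from $1300 back to $1200: now 1300 is the old value, so divide by 1300.
print(round(percent_change(1300, 1200), 2))  # -7.69 (a 7.69% decrease)
```

The sign of the result tells you the direction: positive for growth, negative for a decrease. The asymmetry you noticed falls out of the fact that the denominator changes depending on which value is the starting point.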