Basically, what is the difference between $1000\times1.03$ and $1000/0.97$?
For some reason I feel like both should result in the same number. I ask because I'm working on a problem that adds in a percentage of waste. For $3\%$ waste, I would think that you could multiply the amount by $1.03$ to add $3\%$. However, my professor divided by $0.97$.
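For concreteness, computing both with the $1000$ from the title:
$$1000 \times 1.03 = 1030, \qquad \frac{1000}{0.97} \approx 1030.93,$$
so the results are close but not the same.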
Just multiply
$$1.03 \times 0.97=0.9991$$ to see the difference.
Something that increases by $3\%$ and then decreases by $3\%$ has slightly decreased. This is because the $3\%$ decrease is measured not against the original value, but against the increased value.
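That asymmetry is the whole story here. If the problem means that $3\%$ of the material is wasted, then you need a starting amount $x$ whose remaining $97\%$ equals the amount required; assuming the required amount is the $1000$ from the question,
$$0.97x = 1000 \quad\Longrightarrow\quad x = \frac{1000}{0.97} \approx 1030.93.$$
Multiplying instead gives $1000 \times 1.03 = 1030$, which falls short: $3\%$ of $1030$ is $30.90$, so after waste only $1030 - 30.90 = 999.10$ remains.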