I would love it if someone could clear up my confusion. Who is right, and what are these numbers telling me?
I'm trying to show the percent change in some software performance times here at work, and I'm having trouble figuring out the math. When I Google "percent change", I find several sources that conflate it with "percent difference", and I find 3 different equations to calculate it. I have a stats-major friend trying to help me, and either we're not communicating and they aren't understanding what I'm asking for, I'm wrong/stupid and showing it/crazy, or they've been the recipient of a poor education.
So most sources, including the stats major, tell me percent change is $\frac{\text{old} - \text{new}}{\text{old}} \times 100$.
Ok, let's play with some numbers. I use Google as my calculator. My numbers are going to be 200k as the old value and 4 as the new value. The units are in clock ticks, if anyone cares, but I don't think it should matter.
According to Google, $\frac{200000 - 4}{200000} * 100 = 99.998\%$ faster.
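A quick sanity check of that arithmetic in Python (the variable names are just mine, a minimal sketch):

```python
# Percent change per the formula (old - new) / old * 100
old = 200_000  # old time, in clock ticks
new = 4        # new time, in clock ticks

percent_change = (old - new) / old * 100
print(percent_change)  # 99.998
```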
I read this as telling me that increasing $4$ by $99.998\%$ should get me back to the old value, i.e. that $99.998\%$ applied to $4$ gives $200000$.
How am I interpreting this entirely wrong? What does this percentage mean? How do I change $4$ by $99.998\%$ and get to the old value? I can't believe that this is only a $\approx 100\%$ change...
When you raise a value by some percentage and then lower the result by the same percentage, you don't get back to where you started. When the percentages are small, you come close.

Let's use smaller numbers for an example. If you start with $200$ and reduce it by $50\%$, you get $100$. If you then raise that by $50\%$, you get $100 \cdot 1.5 = 150$, so you are down $50$ at the end.

In general, if you increase by a fraction $x$, you multiply by $1+x$; if you decrease by $x$, you multiply by $1-x$. The product of these is $(1+x)(1-x) = 1-x^2$, so you will always end up lower. If $x$ is small, $x^2$ is even smaller, so it might be negligible.
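A short sketch of this asymmetry, using the numbers above and then the asker's own figures (all values here are just the ones from the question):

```python
start = 200
x = 0.5  # 50%, as a fraction

down = start * (1 - x)       # reduce by 50% -> 100
back_up = down * (1 + x)     # raise by 50%  -> 150, not 200
print(down, back_up)

# The asker's case: a 99.998% *decrease* from 200000 lands on (about) 4,
# but going back up from 4 to 200000 is a factor of 50000,
# i.e. a 4,999,900% increase -- not a 99.998% one.
print(200_000 * (1 - 0.99998))   # approximately 4
print((200_000 / 4 - 1) * 100)   # percent increase needed to undo it
```

So the $99.998\%$ figure describes the decrease from the old value, not the increase you would apply to the new value to undo it.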