Why is there no pattern between a fractional growth rate and a unit-multiplied growth rate?


For example, when increasing $20$ by $75\%$, we add $75\%$ of $20$ to $20$. But why, when we increase $20$ by $200\%$, don't we add $200\%$ of $20$ to $20$? (Or do we?)

Because of this apparent inconsistency, it almost feels as though this procedure was invented arbitrarily, but agreed upon nevertheless.


There are 2 best solutions below


There is no inconsistency. If you want to add $200\%$ of $20$ to $20$, then you'd do it the same way that you'd add $75\%$ of $20$ to $20$.

For example,
\[ 0.75\times 20+20=20\times 1.75=35 \]
\[ 2.00\times 20+20=20\times 3.00=60 \]

It's the same arithmetic with just different numbers.
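The two calculations above can be sketched with one small helper (the function name is illustrative, not from the original post); "increase $x$ by $p\%$" is always $x + (p/100)\,x$, regardless of whether $p$ is below or above $100$:

```python
def increase_by_percent(value, percent):
    """Return `value` increased by `percent` percent of itself."""
    # Same formula covers both cases: value + (percent/100) * value
    return value + (percent / 100) * value

print(increase_by_percent(20, 75))   # 75% increase -> 35.0
print(increase_by_percent(20, 200))  # 200% increase -> 60.0
```

Note that nothing special happens when the percentage crosses $100$; the formula is uniform.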


As everyone has said, there is no inconsistency. However, the methods of calculation do differ.

You are asking about increasing $20$ by $200\%$. Taking $200\%$ of any number means doubling its value, because $100\%$ means the number itself.

So $200\%$ of $y$ means $2y$, and when you increase $y$ by $200\%$, it becomes $y+2y=3y$.

A $75\%$ increase in $20$ means multiplying $20$ by a factor of $1.75$. Why? Because it (indirectly) means $20 + 0.75\times 20=20(1+0.75)=1.75\times 20$.
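The "multiply by a single factor" view described above can be checked numerically; this sketch (names are illustrative) confirms that adding $p\%$ of $x$ to $x$ and multiplying $x$ by $(1 + p/100)$ always agree:

```python
def growth_factor(percent):
    """Single multiplier equivalent to a `percent` percent increase."""
    return 1 + percent / 100

x = 20
for p in (75, 200):
    additive = x + (p / 100) * x           # add p% of x to x
    multiplicative = x * growth_factor(p)  # one multiplication
    # The two forms are algebraically identical: x(1 + p/100)
    print(p, additive, multiplicative)
```

This prints matching values ($35$ for $p=75$, $60$ for $p=200$), illustrating that the factor form is just the additive form with $x$ factored out.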

I am not sure whether there is a name for this process, but it is a very good way to deal with percentages.