If something increases from $50$ to $200$, I know by common sense that it is a $400\%$ increase.
I can get this using $\dfrac{200}{50}\times 100\% = 400\%$.
If something increases from $50$ to $52$, I know by common sense that it is a $4\%$ increase.
But if I apply the same logic, I get $\dfrac{52}{50}\times 100\% = 104\%$.
What is the problem with my logic?
Your formula computes the ratio of the new value to the old value, i.e. the new value *as a percentage of* the old value, not the increase. Percentage increase is $$\frac{\text{new number} - \text{old number}}{\text{old number}}\times 100 \%$$
The right computation is $$\frac{200-50}{50} \times 100 \%=300\%$$
Notice that your results of $400\%$ and $104\%$ are each exactly $100\%$ more than the true increases of $300\%$ and $4\%$: dividing new by old also counts the original $100\%$, so subtracting the old value first (or subtracting $100\%$ at the end) gives the increase.
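To make the distinction concrete, here is a small sketch in Python (the function name `percent_increase` is my own, just for illustration) that applies the formula above to both of your examples:

```python
def percent_increase(old, new):
    """Relative change from old to new, as a percentage of old."""
    return (new - old) / old * 100

def percent_of_old(old, new):
    """New value expressed as a percentage of old (your original formula)."""
    return new / old * 100

# 50 -> 200: the new value is 400% of the old, but the increase is 300%.
print(percent_of_old(50, 200))    # 400.0
print(percent_increase(50, 200))  # 300.0

# 50 -> 52: the new value is 104% of the old, and the increase is 4%.
print(percent_of_old(50, 52))     # 104.0
print(percent_increase(50, 52))   # 4.0
```

In both cases the two functions differ by exactly $100$ percentage points, which is the mismatch in your logic.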