For example, when increasing $20$ by $75\%$, we add $75\%$ of $20$ to $20$. But why, when we increase $20$ by $200\%$, don't we add $200\%$ of $20$ to $20$? (Or do we?)
This apparent inconsistency almost makes me feel like the procedure was invented arbitrarily, yet agreed upon nevertheless.
There is no inconsistency. If you want to add $200\%$ of $20$ to $20$, then you'd do it the same way that you'd add $75\%$ of $20$ to $20$.
For example, \[ 0.75\times 20+20=20\times 1.75=35 \] \[ 2.00\times 20+20=20\times 3.00=60 \]
It's the same arithmetic with just different numbers.
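More generally, if we write $x$ for the starting amount and $p$ for the percentage (symbols introduced here just for illustration), increasing $x$ by $p\%$ means \[ x+\frac{p}{100}\,x=x\left(1+\frac{p}{100}\right). \] With $x=20$, the two examples above are exactly this formula: $20\times 1.75=35$ for $p=75$ and $20\times 3.00=60$ for $p=200$.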