This is a kind of abstract question regarding the mechanisms and logic of mathematics.
First, let me try to explain what I tried to convey with the topic title.
Let's say I have a value that decreases to a lower one, and I want to calculate the percentage difference: for example, 13 going down to 8.
13 - 8 = 5
So now I would divide the difference of 5 by the original value of 13, which is what the topic is about.
5 / 13 = 0.3846
And then of course I'd multiply 0.3846 by 100 to get the proper percentage difference between 13 and 8.
0.3846 * 100 = 38.46
At which point I know the percentage difference is 38.46.
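The three steps above can be sketched as a small function (a sketch of my own, the name `percent_decrease` is not from the question):

```python
def percent_decrease(original, new):
    """Return the decrease from `original` to `new` as a percentage of `original`."""
    difference = original - new          # 13 - 8 = 5
    fraction = difference / original     # 5 / 13 ≈ 0.3846
    return fraction * 100                # ≈ 38.46

print(round(percent_decrease(13, 8), 2))  # 38.46
```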
But the part that I really don't understand is why there is a logical reason for dividing the difference of 5 by the original value of 13. I can accept that we do it because it works, but I don't understand why exactly it works.
I hope this question makes sense. Basically, on an intuitive or logical level, I can't seem to understand why the difference is divided by the original value, other than "it works because reasons".
It's kind of like a measure of the closeness of one value to another that can be normalized over different ranges. Imagine we are only dealing with positive numbers for now.
Let's say we have a value and we want a measure of how close another number is to it. Let's say we decide that a reasonable way to measure how close two numbers are is to just look at their difference. Say you have $10$ and $5$; then by our measure of closeness, they are $5$ units close. Now imagine you have $1000000000$ and $999999995$; these two by our measure of closeness are $5$ units close. But would you say that is a fair assessment?
Sometimes it is useful to compare how close two values are across very different ranges, and this is where the percentage difference comes in. You take the difference and divide it by the original number to reflect the fact that a difference of $5$ between small numbers like $10$ and $5$ has a much "bigger" effect than a difference of $5$ between two very large values like $1000000000$ and $999999995$. So the percentage difference between $10$ and $5$ is $50\%$, while the percentage difference between $1000000000$ and $999999995$ is $0.0000005\%$, reflecting this.
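To make the normalization concrete, here is a quick sketch (the function name `percent_difference` is my own) comparing the two pairs of numbers above, both of which differ by the same absolute amount of $5$:

```python
def percent_difference(original, new):
    # The absolute difference is the same for both pairs below,
    # but dividing by the original value rescales it to the size
    # of the numbers involved.
    return (original - new) / original * 100

print(percent_difference(10, 5))                       # 50.0
print(percent_difference(1_000_000_000, 999_999_995))  # ≈ 0.0000005
```

The same raw difference of $5$ yields wildly different percentages, which is exactly the point of dividing by the original value.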
A rough analogy: taking \$5 away from a poor man whose net worth is \$10 affects him much more than taking \$5 away from a billionaire, because the percentage of their wealth being taken is hugely different.
This is why you divide the difference by the original value.