Let's say that I have 50 coins in total.
I give 20 coins to my friend; I'm left with 30.
Then I give 15 more; I'm left with 15.
I give 9 more; I'm left with 6.
I give 6; I'm left with 0.
When I sum up what I have left each time, I get $$0+6+15+30 = 51.$$ But I only had 50 coins.
When I sum up what I give each time, I get $$6+9+15+20 = 50.$$
The two sums are different. Why?
There's absolutely no reason for the sum of the "leftover amounts" to equal what you started with. For a simple example, consider what happens if each day you give away one coin. Then your "leftover amounts" are $49, 48, 47, \dots, 1, 0$, and their sum is much more than $50$ (precisely, it's $1225$). Conversely, if you give away everything at once, you have only one leftover amount, and it is less than $50$ (precisely, it's $0$).
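To check that figure: the leftovers form an arithmetic series,
$$49+48+\cdots+1+0=\frac{49\cdot 50}{2}=1225.$$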
The point is that for the sum of the "leftover amounts" to equal the original number of coins, each coin would have to be left over at exactly one moment in time (so that each coin is represented exactly once in that sum). But that's obviously false in general.
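In fact, you can see exactly where your $51$ comes from: each coin is counted once for every giveaway it survives. With your schedule, the first $20$ coins survive no leftover snapshot, the next $15$ survive one (the $30$), the next $9$ survive two (the $30$ and the $15$), and the last $6$ survive three (the $30$, the $15$, and the $6$), so the leftover sum is
$$15\cdot 1+9\cdot 2+6\cdot 3=51.$$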
On the other hand, each coin you start with is eventually given away, and you can never give away the same coin twice; that's why the sum of the "takeaway amounts" is the number of coins at the start (each coin is counted exactly once, in exactly one of the "takeaway amounts").
Basically, before you expect to add up a bunch of numbers and get a particular result, you need some reason to believe that the collection of numbers and the expected result are related. For the leftover amounts, there is no such reason.
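If it helps, here's a minimal Python sketch of the same bookkeeping (the variable names are mine; the `giveaways` schedule is just the one from your question, but any schedule that exhausts the coins behaves the same way):

```python
start = 50
giveaways = [20, 15, 9, 6]  # the schedule from the question

# Track what's left after each giveaway.
leftovers = []
remaining = start
for g in giveaways:
    remaining -= g
    leftovers.append(remaining)

# Each coin leaves exactly once, so this always equals the start:
print(sum(giveaways))  # 50

# A coin is counted once per giveaway it survives, so this can be almost anything:
print(sum(leftovers))  # 51
```

Try swapping in `[1] * 50` or `[50]` for the schedule: the first sum stays $50$, while the second becomes $1225$ or $0$, matching the two extremes above.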