I'm attempting to compute $$\sum_{n=1}^{10000} \frac{1}{n^2}$$ numerically, using a computer program. However, if I sum the terms in the natural order (starting from $n=1$ and going up one unit each time) I get a different result than if I compute the sum in the reverse order (starting from $n=10000$ and going down one unit at a time). I understand it must have something to do with the rounding the computer does in every single operation, but I can't seem to explain the phenomenon in a clear manner. Any help? Thank you in advance.
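A minimal reproduction of what I'm seeing (a sketch assuming ordinary IEEE-754 double-precision arithmetic, as in Python's built-in `float`):

```python
# Sum 1/n^2 forward (largest terms first) and in reverse (smallest terms
# first). Every addition is rounded to the nearest representable double,
# so the two orders typically disagree in the last few bits.
N = 10_000

forward = 0.0
for n in range(1, N + 1):
    forward += 1.0 / (n * n)

backward = 0.0
for n in range(N, 0, -1):
    backward += 1.0 / (n * n)

print(repr(forward))
print(repr(backward))
print(forward == backward)
```

Adding the small terms first (the reverse order) lets them accumulate before they are absorbed into a much larger partial sum, so it is generally the more accurate of the two.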
Non-commutative sum?

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-02-22 · 527 views
Perhaps a simpler example may help. Let's say you have a computer that does decimal floating-point arithmetic with only $3$ significant digits, and you want to add $999 + 1 + 1 + \ldots + 1$ (with one hundred $1$'s).
If you do it in that order, the first addition gives you $999 + 1 = 1000 = 1.00 \times 10^3$, and each subsequent addition leaves it there: $1000 + 1 = 1001$, but rounded to $3$ digits that's still $1.00 \times 10^3$.
On the other hand, try it in the reverse order: $1 + 1 = 2$, $2 + 1 = 3$, $\ldots$, $99 + 1 = 100$, and finally $100 + 999 = 1099$, which rounds to $1.10 \times 10^3$.
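The toy $3$-digit arithmetic above can actually be run with Python's `decimal` module, which lets you set the working precision (a sketch; any decimal arithmetic rounded to $3$ significant digits behaves the same way):

```python
from decimal import Decimal, getcontext

getcontext().prec = 3            # 3 significant decimal digits

one = Decimal(1)

# Forward order: 999 first, then one hundred 1's.
s = Decimal(999)
for _ in range(100):
    s = s + one                  # 999 + 1 = 1000; every later +1 rounds away

# Reverse order: the 1's first, then 999.
t = Decimal(0)
for _ in range(100):
    t = t + one                  # exact: 1, 2, ..., 100
t = t + Decimal(999)             # 1099, rounded to 3 digits

print(s)                         # 1.00E+3
print(t)                         # 1.10E+3
```

The forward sum gets stuck at $1.00 \times 10^3$ because each increment of $1$ is lost in rounding, while the reverse sum accumulates the $1$'s exactly before the one large, rounded addition at the end.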