why does commutativity of addition fail for infinite sums?


While discussing the sum of a particular series, $\sum\limits_{n=0}^{\infty}{\left(-1\right)}^n$ (a sum that I've heard is alleged to be equal to $\frac{1}{2}$), it was mentioned to me that addition is not necessarily commutative when you are adding an infinite number of terms, and that the so-called proof of the aforementioned allegation is flawed. If this is correct, can somebody please clarify or explain to me if, why, or how the commutative property of addition can fail under such circumstances?

[edit] Thank you all for the responses so far, but the primary purpose of my question was to understand WHY the commutative property would allegedly not apply to infinite series such as the one I mentioned above. I'm not asking why it should or should not be equal to $\frac{1}{2}$... I want to understand what's wrong with applying the commutative property of addition to make it more convenient to compute with such a series, because if nothing is wrong with applying the commutative property in such a case, then it seems to follow that $\sum\limits_{n=0}^{\infty}{\left(-1\right)}^n$ IS equal to $\frac{1}{2}$... And if it really is not equal to $\frac{1}{2}$, then there must be some underlying reason why the commutative property of addition doesn't apply. I am asking what that reason is.


There are 5 solutions below.

On BEST ANSWER

I know it sounds like I'm avoiding the real question, but here we go:

Q: "WHY the commutative property would allegedly not apply to infinite series such as the one I mentioned above"

A: Why would it?

This is one way to look at things, which I find intuitive but some may not:

The commutative property of addition is that $a+b=b+a$. Technically, it applies only to sums of two numbers! Using the associativity of addition, we may apply it repeatedly to rearrange a finite sum in a finite number of steps (where between each $=$ sign we have swapped only two adjacent summands). Even if the sum is $\sum_{k=0}^n a_k$ with $n$ unknown, we know that whatever the finite value of $n$ is, the terms can be rearranged by finitely many applications of the commutativity rule, so we may rearrange the terms of $\sum_{k=0}^n a_k$ arbitrarily.
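To see concretely that finitely many terms can be summed in any order, here is a small Python check (an illustration I'm adding, not part of the original answer): it sums every permutation of a four-term list and finds a single common value.

```python
import itertools
import math

# Finitely many terms: every ordering gives the same sum, because
# commutativity and associativity only need to be applied finitely often.
terms = [1.0, -0.5, 1.0 / 3.0, -0.25]

# math.fsum computes a correctly rounded sum, so equal sums compare equal.
sums = {math.fsum(p) for p in itertools.permutations(terms)}
print(len(sums))  # 1: all 24 orderings agree
```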

However, rearranging the infinitely many terms of e.g. $\sum_{k=1}^\infty \frac{(-1)^k}{k}$ cannot be done by applying the commutativity rule repeatedly! I think @josh314's answer explains infinite sums very nicely, so I won't repeat it here.

An example of why we are generally not allowed to just apply a rule an infinite number of times: take the rule "if $S \subseteq \mathbb{N}$ is a nonempty finite set, then $S \cup \{\max S + 1 \}$ is a finite set". Applying the rule to $\{0\}$ repeatedly, we get $\{0,1\}$, $\{0,1,2\}$, $\dots$, and every one of these indeed is a finite set. However, if we were to apply the rule "an infinite number of times", we would end up concluding that $\mathbb{N}$ is a finite set.

On

There is a famous theorem of Riemann that says that for a conditionally convergent series, we can re-order the terms so that the series converges to any limit or even diverges.

A series $a_1 + a_2 + a_3 +\cdots$ is said to converge absolutely if the series $|a_1|+|a_2|+|a_3|+\cdots$ converges. If a convergent series does not converge absolutely then it converges conditionally.

For an absolutely convergent series you can reorder the terms however you like and you will always get the same limit. It is only for conditionally convergent series that the order matters.
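As a numerical illustration of the conditional case (my addition, not part of this answer; the rearrangement used is the classical "two positive terms, then one negative" pattern): the alternating harmonic series converges to $\ln 2$ in its usual order, but the rearranged series converges to $\frac{3}{2}\ln 2$ instead, with exactly the same terms.

```python
import math

def alternating(n):
    """Partial sum of 1 - 1/2 + 1/3 - ... with n terms (usual order)."""
    return math.fsum((-1) ** (k + 1) / k for k in range(1, n + 1))

def rearranged(blocks):
    """Partial sum of 1 + 1/3 - 1/2 + 1/5 + 1/7 - 1/4 + ...
    (two positive terms, then one negative, repeated `blocks` times)."""
    terms = []
    odd, even = 1, 2
    for _ in range(blocks):
        terms += [1.0 / odd, 1.0 / (odd + 2), -1.0 / even]
        odd += 4
        even += 2
    return math.fsum(terms)

print(alternating(10 ** 6))  # ≈ ln 2       ≈ 0.6931
print(rearranged(10 ** 6))   # ≈ (3/2) ln 2 ≈ 1.0397
```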

On

To complete the answer of Fly by Night, the following example is enlightening: consider the sum $1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\dots$. In this order, one can prove that it converges to $\ln(2)$ (it's not important how). The important thing is that both the sum of the positive terms and the sum of the negative terms diverge (because the series is not absolutely convergent). So if you aim at any real $a$, you can add positive terms until you go above $a$ (you will always get above it, because the sum of the positive terms diverges), then negative terms until you go below $a$, then back to positive terms, and so on. The terms become smaller and smaller, so your distance to $a$ shrinks, and the reordered sum converges to $a$. You can also go to $\infty$ by going above $1$, then taking one negative term, going above $2$, one negative term, and so on... So depending on how you order the terms, you can reach any finite limit, or $\infty$ (or $-\infty$).
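The greedy procedure described above can be sketched in Python (my illustration; the function name and term budget are my own choices): aim at a target, add odd reciprocals while the running sum is at or below it, and subtract even reciprocals while it is above.

```python
import math

def rearrange_to(target, n_terms=200000):
    """Greedily reorder 1 - 1/2 + 1/3 - 1/4 + ... so that the
    partial sums steer toward `target` (Riemann-rearrangement style)."""
    pos, neg = 1, 2   # next unused odd / even denominator
    s = 0.0
    for _ in range(n_terms):
        if s <= target:
            s += 1.0 / pos   # positive terms: 1, 1/3, 1/5, ...
            pos += 2
        else:
            s -= 1.0 / neg   # negative terms: -1/2, -1/4, ...
            neg += 2
    return s

print(rearrange_to(math.pi))  # ≈ 3.14159
print(rearrange_to(-1.0))     # ≈ -1.0
```

Since the terms shrink to 0, the overshoot at each crossing shrinks too, which is why the partial sums home in on the target no matter what it is.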

On

The commutative property of addition applies directly to finite sums only. An infinite sum is a limit of a sequence of finite sums: you should interpret $\sum_{n=0}^\infty a_n$ as $\lim_{n\to\infty} s_n$ where $s_n = \sum_{k=0}^n a_k$. If you only rearranged the first few $a_k$, say those with $k \le N$, the partial sums $s_n$ for $n<N$ would change, but by the commutative property of addition the $s_n$ for $n\ge N$ would remain the same. So the limit of the sequence $\{s_n\}$, and hence the infinite sum, would be unchanged. Where things get sticky is when the rearrangement of the $a_k$ is not confined to a bounded range of $k$ (for example, if you have an alternating series and want to add up all the positive terms first). This changes $s_n$ for arbitrarily large $n$ and so can modify the limit as $n\to\infty$. As other answers and comments have pointed out, there are conditions under which the limit is unchanged even by rearrangements of unboundedly many terms, but that is not the general case.
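Applied to the series from the question, this partial-sum definition already settles matters before any rearranging happens: the $s_n$ of $\sum_{n=0}^\infty (-1)^n$ oscillate and never converge. A quick check (my illustration):

```python
# Partial sums s_n of sum_{n>=0} (-1)^n: they alternate 1, 0, 1, 0, ...
# so the limit defining the infinite sum does not exist. The value 1/2
# arises only under summation methods (Cesàro, Abel) that replace this limit.
partials = []
s = 0
for n in range(8):
    s += (-1) ** n
    partials.append(s)
print(partials)  # [1, 0, 1, 0, 1, 0, 1, 0]
```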

On

Actually, commutativity does hold for infinite sums. Glossing over a few technicalities, commutativity says that you get the same answer whenever you interchange any two terms in a sum. As such, this holds for infinite sums as well as finite ones:

$$(\cdots+a+\cdots+b+\cdots)=(\cdots+b+\cdots+a+\cdots)$$

(That is, the two series converge or diverge together, and if they converge, they do so to the same limit.) By induction on $k$, you can show that two sums are equal whenever any $k$ terms appear in any order. But the inductive proof only says what happens for whole numbers $k$. It doesn't have anything to say when you start interchanging infinitely many terms. And as other answers have pointed out, if a series is conditionally but not absolutely convergent, then rearranging infinitely many terms can produce anything under the sun.