Confusion with proof of Cauchy criterion for infinite sums


I'm learning about the Cauchy criterion for the convergence of series. I am getting tripped up by a specific part of the proof, though. I'll list the part I'm confused by first and then paste the whole proof below that for reference.


$\left|\sum_{i\in I_0 \cup J} a_i - C\right| < \frac{\epsilon}{2}$

$\left|\sum_{i\in I_0} a_i - C\right| < \frac{\epsilon}{2}$

By subtracting these two inequalities and remembering that, since $I_0$ and $J$ are disjoint,

$\sum_{i\in I_0 \cup J} a_i = \sum_{i\in I_0} a_i + \sum_{i\in J} a_i$,

we have $\left|\sum_{i\in J} a_i\right| < \epsilon$.


I am confused by how subtracting these two inequalities gives us this result. Does subtracting them both not give us the following?

$\left|\sum_{i\in J} a_i\right| < 0$

My reasoning: when we subtract the two inequalities, since the order of subtraction doesn't matter, we can drop the absolute-value bars around the individual terms and place them around the entire left-hand side. The $C$'s then cancel, as do the sums over $I_0$. But since this is subtraction, the two $\frac{\epsilon}{2}$ terms on the right cancel as well.

What am I missing here?
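For a concrete illustration (my own, not from the book): with $C = 0$, both $|a - C| < \epsilon/2$ and $|b - C| < \epsilon/2$ can hold while $|a - b|$ is strictly positive, so the right-hand bounds of two inequalities cannot simply be subtracted. The values below are chosen by hand just to probe that step:

```python
# Hypothetical values chosen to probe the "subtract the inequalities" step.
eps = 1.0
C = 0.0
a = 0.4   # |a - C| = 0.4 < eps/2
b = -0.4  # |b - C| = 0.4 < eps/2

assert abs(a - C) < eps / 2
assert abs(b - C) < eps / 2

# If the bounds subtracted, we would conclude |a - b| < eps/2 - eps/2 = 0,
# which is impossible here:
assert abs(a - b) == 0.8   # strictly positive...
assert abs(a - b) < eps    # ...but still below eps, as the triangle inequality guarantees
```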

Here's the whole proof, for reference. Only posting the forward direction since that's what I'm confused on.


Want to prove: A necessary and sufficient condition that the sum $\sum_{i \in I} a_i$ converges is that $\forall \epsilon > 0$, there is a finite set $I_0$ such that

$\left|\sum_{i\in J} a_i\right| < \epsilon$

for every finite set $J \subset I \setminus I_0$.

Proof:

Suppose $\sum_{i \in I} a_i = C$ converges. Then for every $\epsilon > 0$ there is a finite set $I_0$ so that

$\left|\sum_{i\in K} a_i - C\right| < \frac{\epsilon}{2}$

for every finite set $K$ with $I_0 \subset K \subset I$. Now let $J$ be a finite subset of $I \setminus I_0$, and apply this first with $K = I_0 \cup J$ and then with $K = I_0$.

Then

$\left|\sum_{i\in I_0 \cup J} a_i - C\right| < \frac{\epsilon}{2}$

$\left|\sum_{i\in I_0} a_i - C\right| < \frac{\epsilon}{2}$

By subtracting these two inequalities and remembering that, since $I_0$ and $J$ are disjoint,

$\sum_{i\in I_0 \cup J} a_i = \sum_{i\in I_0} a_i + \sum_{i\in J} a_i$,

we have $\left|\sum_{i\in J} a_i\right| < \epsilon$.
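As a numerical sanity check of the forward direction (my own, not from TBB), one can test the criterion on the convergent sum $\sum_{i\ge 1} 2^{-i}$: taking $I_0 = \{1,\dots,N\}$ with the tail $2^{-N}$ below $\epsilon$, every finite $J \subset I \setminus I_0$ should satisfy $\left|\sum_{i\in J} a_i\right| < \epsilon$.

```python
import itertools
import math

# Sanity check of the forward direction on a_i = 2**-i over I = {1, 2, ...}.
def a(i):
    return 2.0 ** -i

eps = 1e-3

# Choose I_0 = {1, ..., N} with N large enough that the entire tail
# sum_{i > N} 2**-i = 2**-N is already below eps.
N = math.ceil(-math.log2(eps))
assert 2.0 ** -N < eps
I0 = set(range(1, N + 1))

# Every finite J disjoint from I_0 must satisfy |sum_{i in J} a_i| < eps.
# Exhaustively test all 3-element sets J drawn from the next few indices.
for J in itertools.combinations(range(N + 1, N + 8), 3):
    assert not I0 & set(J)                  # J is disjoint from I_0
    assert abs(sum(a(i) for i in J)) < eps  # the criterion's bound holds
```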


(For reference, I am getting this proof from Elementary Real Analysis by TBB. Also, I understand the Cauchy criterion for sequences.)

Best answer:

If $\lvert a\rvert<\frac\varepsilon2$ and $\lvert b\rvert<\frac\varepsilon2$, then you know that $$-\frac\varepsilon2<a<\frac\varepsilon2\tag1$$ and that $-\frac\varepsilon2<b<\frac\varepsilon2$, which is equivalent to $$-\frac\varepsilon2<-b<\frac\varepsilon2.\tag2$$ But if you add $(1)$ and $(2)$, what you get is that $-\varepsilon<a-b<\varepsilon$; in other words, $\lvert a-b\rvert<\varepsilon$.
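A quick numerical spot-check of this argument (my own sketch): sample $a, b$ with $|a|, |b| < \epsilon/2$ and confirm $|a - b| < \epsilon$ in every case.

```python
import random

# Spot-check: if |a| < eps/2 and |b| < eps/2, then |a - b| < eps.
eps = 1.0
random.seed(0)
for _ in range(10_000):
    a = random.uniform(-eps / 2, eps / 2)
    b = random.uniform(-eps / 2, eps / 2)
    # Adding -eps/2 < a < eps/2 to -eps/2 < -b < eps/2 gives
    # -eps < a - b < eps, i.e. |a - b| < eps.
    assert abs(a - b) < eps
```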