Introduction to Analysis: Multiplication Theorem for Series


I've been stuck on this problem over the weekend so I decided to ask for some direction. The problem reads:

"The multiplication theorem for series requires that the two series be absolutely convergent; if this condition is not met, their product may be divergent. Show that the series $\sum_0^∞ \frac{(-1)^i}{\sqrt{1 + i}}$ gives an example: it is conditionally convergent, but its product with itself is divergent. (Estimate the size of the odd terms $c_{2n+1}$ in the product.)"

Someone earlier suggested I compute the Cauchy product and show that it diverges. If I understand correctly, the two series must converge absolutely. If I can show that the two do not converge absolutely, then I have shown what the problem asks for.

So I tried and had this in mind. I reckon the proof is not correct because I do not fully understand how to work with the Cauchy product, but it's an idea.

Consider $a_n = (-1)^n$, $b_n = \frac{1}{\sqrt{n+1}}$. The sum of their products, $\sum_0^{\infty} c_i = \sum_0^{\infty}\frac{(-1)^i}{\sqrt{1 + i}}$, converges conditionally. Thus $\forall \epsilon > 0$, $\exists N$ s.t. $n, m \geq N \rightarrow |\sum_n^m c_i| < \epsilon$.
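(As a quick numerical sanity check of the conditional-convergence claim — not part of the proof — here is a small Python sketch; the helper name `alt_partial_sum` is mine. The Leibniz criterion guarantees consecutive partial sums bracket the limit, with gap equal to the magnitude of the next term.)

```python
import math

def alt_partial_sum(N):
    """Partial sum sum_{i=0}^{N} (-1)^i / sqrt(i+1)."""
    return sum((-1) ** i / math.sqrt(i + 1) for i in range(N + 1))

# Leibniz: consecutive partial sums bracket the limit, and their
# gap is exactly the magnitude of the next term, 1/sqrt(N+2).
for N in (10, 100, 1000):
    gap = abs(alt_partial_sum(N + 1) - alt_partial_sum(N))
    assert abs(gap - 1 / math.sqrt(N + 2)) < 1e-12
```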

We assume without loss of generality that the series of $a_n$ and the series of $b_n$ do not converge absolutely.

$\left|\sum_0^{\infty} (-1)^i\right| \leq \sum_0^{\infty} |(-1)^i| = \sum_0^{\infty} 1$

As $n$ approaches infinity, $\sum_0^{n} 1 = 1 + 1 + \cdots + 1 \to \infty$, thus the series does not converge absolutely.

|$\sum_0^{\infty} \frac{1}{\sqrt{i+1}}$| = $\sum_0^{\infty} |\frac{1}{\sqrt{i+1}}|$ = $\sum_0^{\infty} \frac{1}{\sqrt{i+1}}$.

As $n$ approaches infinity, $\sum_0^{n} \frac{1}{\sqrt{i+1}} \to \infty$ (by comparison with the harmonic series, since $\frac{1}{\sqrt{i+1}} \geq \frac{1}{i+1}$), thus diverging.
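(Again as a numerical sanity check rather than a proof: since $1/\sqrt{x}$ is decreasing, the partial sums of $\sum 1/\sqrt{i+1}$ dominate the integral lower bound $2\sqrt{N+2} - 2$, which grows without bound. A short Python sketch; the helper name `partial_abs_sum` is mine.)

```python
import math

def partial_abs_sum(N):
    """Partial sum sum_{i=0}^{N} 1/sqrt(i+1)."""
    return sum(1.0 / math.sqrt(i + 1) for i in range(N + 1))

# Since 1/sqrt(x) is decreasing, the partial sum dominates the
# integral of 1/sqrt(x) over [1, N+2], which is 2*sqrt(N+2) - 2.
for N in (10, 100, 1000, 10000):
    assert partial_abs_sum(N) >= 2 * math.sqrt(N + 2) - 2
```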

So I understand the idea; I just don't know how to finish the proof using the Cauchy product.

Thank you for taking the time to read this and thanks in advance for commenting.


There are several things wrong with your proof. I'll give some hints on how to proceed later, but first some points about your proof:

  • You are not finished when you have shown that the sums over $a_n$ and $b_n$ do not converge absolutely. See below to find what you actually have to prove.
  • Your assumption about the sums over $a_n$ and $b_n$ "without loss of generality" is not justified. An assumption "without loss of generality" must be somehow justified. For example, when proving something about a symmetric bilinear form on a finite-dimensional vector space over $\mathbb{C}$ you can assume without loss of generality that it is represented by a diagonal matrix. This is because you can choose a basis such that the representation with respect to this basis is a diagonal matrix. Note that this only holds if choosing another basis does not change the rest of your argument. In your case, the series either converge absolutely or they don't; there is no room to assume this without loss of generality. See Wikipedia for a better example than mine.
  • You give an upper bound for the sum over $a_n$ and then show that this upper bound diverges. This does not tell you anything about $\sum a_n$. For an extreme example, consider $\sum_{n=1}^\infty 0 \le \sum_{n=1}^\infty 1 = \infty$, but the first sum clearly converges (absolutely).
  • For absolute convergence, you need to show that $\sum |a_n|$ converges, not $|\sum a_n|$.
  • You tagged your question (cauchy-sequence). The Cauchy product does not have anything to do with Cauchy sequences.

Now, to tackle your problem, choose $a_n := (-1)^n/\sqrt{n+1}$. First you need to show that $\sum_{n=0}^\infty a_n$ converges (conditionally). The Leibniz criterion does this job for you. Now consider the product of the series with itself. We obtain the Cauchy product $$ \left( \sum_{n=0}^\infty a_n \right)\left( \sum_{n=0}^\infty a_n \right) = \sum_{n=0}^\infty c_n, $$ where $$c_n = \sum_{i=0}^n a_i a_{n-i} = \sum_{i=0}^n \frac{(-1)^{i+n-i}}{\sqrt{(i+1)(n-i+1)}}.$$ You need to show that the sum over $c_n$ diverges. The hint in your problem (as I understand it; it is a little ambiguous) tells you to consider $$c_{2n+1} = \sum_{i=0}^{2n+1} \frac{(-1)^{2n+1}}{\sqrt{(i+1)(2n+1-i+1)}} = \sum_{i=0}^{2n+1} \frac{-1}{\sqrt{(i+1)(2n+2-i)}}.$$ Here is an additional hint.
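One way to finish from here (my own sketch of the estimate, not necessarily the intended one): for $0 \le i \le 2n+1$ both factors satisfy $i+1 \le 2n+2$ and $2n+2-i \le 2n+2$, so $\sqrt{(i+1)(2n+2-i)} \le 2n+2$, and each of the $2n+2$ terms of $c_{2n+1}$ is at most $-\frac{1}{2n+2}$. Hence $c_{2n+1} \le -1$: the terms of the product series do not tend to $0$, so it diverges. A quick numerical check of that bound in Python (the function name `c_odd` is mine):

```python
import math

def c_odd(n):
    """c_{2n+1} = sum_{i=0}^{2n+1} a_i * a_{2n+1-i}, with a_k = (-1)^k / sqrt(k+1)."""
    m = 2 * n + 1
    return sum((-1) ** i / math.sqrt(i + 1) * (-1) ** (m - i) / math.sqrt(m - i + 1)
               for i in range(m + 1))

# Every odd-indexed Cauchy-product coefficient stays at or below -1,
# so the terms cannot tend to 0 and the product series diverges.
for n in range(50):
    assert c_odd(n) <= -1
```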