Does the Convolution of Two Series Require Absolute Convergence?


Let $A=(a_n)_{n=0}^\infty$ and $B=(b_n)_{n=0}^\infty$ be sequences of real numbers. Let $C = (c_n)_{n=0}^\infty$ be the sequence such that

$$c_n = {a_0}{b_n} + {a_1}{b_{n-1}} +\cdots+ {a_n}{b_0}.$$
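The definition above can be computed directly. As a minimal sketch (the helper name `convolve` is my own, not from the text):

```python
# Sketch of the convolution (Cauchy product) of two coefficient sequences,
# straight from the definition c_n = a_0*b_n + a_1*b_{n-1} + ... + a_n*b_0.

def convolve(a, b):
    """Return the first min(len(a), len(b)) convolution coefficients."""
    n_terms = min(len(a), len(b))
    return [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(n_terms)]

# Example: a_n = b_n = 1 for all n gives c_n = n + 1, since the sum
# defining c_n has n + 1 terms, each equal to 1.
ones = [1] * 6
print(convolve(ones, ones))  # [1, 2, 3, 4, 5, 6]
```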

Let $G_A(s), G_B(s), G_C(s)$ denote the generating functions of sequences $A,B$ and $C$, respectively.

My textbook claims that $$G_C(s) = \sum_{n=0}^{\infty}c_n s^n = \sum_{n=0}^{\infty} \left(\sum_{k=0}^{n}a_k b_{n-k}\right)s^n = \sum_{k=0}^{\infty}a_k s^k \sum_{n=k}^{\infty}b_{n-k}s^{n-k} = G_A(s)\, G_B(s).$$

I was able to verify the equality in the finite case by induction. In the infinite case, I suspect that some absolute-convergence hypothesis is needed. However, the book did not mention anything about this.

Going slightly off topic, I'd like to ask whether this is why you cannot use induction to prove that two expressions in $n \in \mathbb N$ are equivalent as $n$ goes to infinity.

In this example, induction does not say anything about convergence. The instructor in my Analysis class made that point, but I did not really understand it at the time.

3 Answers

BEST ANSWER

Convergence is not an issue: these are formal power series, and as such are to be thought of as algebraic objects with an associated ‘arithmetic’, not as real- or complex-valued functions. As Herbert S. Wilf says in generatingfunctionology, ‘A generating function is a clothesline on which we hang up a sequence of numbers for display’. It is in the first instance a convenient way to manipulate the sequence of coefficients. In particular, if $A=\langle a_n:n\in\Bbb N\rangle$ and $B=\langle b_n:n\in\Bbb N\rangle$ are sequences (not series!) of real numbers, we can construct their convolution, $C=\langle c_n:n\in\Bbb N\rangle$, directly, by defining $c_n=\sum_{k=0}^na_kb_{n-k}$, or we can get it ‘automagically’ by taking the product of the generating functions of $A$ and $B$:

$$\begin{align*} G_A(x)G_B(x)&=\left(\sum_{n\ge 0}a_nx^n\right)\left(\sum_{n\ge 0}b_nx^n\right)\\\\ &=\sum_{n\ge 0}\left(\sum_{k=0}^na_kb_{n-k}\right)x^n\\\\ &=G_{A*B}(x)\;. \end{align*}$$

There’s actually nothing to prove here: the step

$$\left(\sum_{n\ge 0}a_nx^n\right)\left(\sum_{n\ge 0}b_nx^n\right)=\sum_{n\ge 0}\left(\sum_{k=0}^na_kb_{n-k}\right)x^n\;,$$

is a matter of definition. We define the product of formal power series to be the Cauchy product, generalizing the ordinary product of polynomials. Chapter $2$ of generatingfunctionology starts with a brief introduction to formal power series; you can freely download the whole book here.

Now it’s true that sometimes we do want to treat generating functions as analytic objects rather than as purely formal algebraic objects, and then we do have to worry about convergence. But when we have convergence, the operations behave as one would expect.
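As an illustrative numerical check (my own, not from the answer), take $a_n=(1/2)^n$ and $b_n=(1/3)^n$, both of whose generating series converge absolutely at $s=1/2$; the series for $G_C(s)$ then converges to the product $G_A(s)\,G_B(s)$:

```python
# Hedged numerical sketch: when both series converge absolutely at s,
# the Cauchy-product series G_C(s) converges to G_A(s) * G_B(s).
# Here a_n = (1/2)^n, b_n = (1/3)^n, evaluated at s = 1/2, where
# G_A(s) = 1/(1 - s/2) = 4/3 and G_B(s) = 1/(1 - s/3) = 6/5.

s = 0.5
N = 60  # truncation order; assumed large enough for double precision

a = [0.5 ** n for n in range(N)]
b = [(1 / 3) ** n for n in range(N)]
c = [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(N)]

G_A = sum(a_n * s ** n for n, a_n in enumerate(a))
G_B = sum(b_n * s ** n for n, b_n in enumerate(b))
G_C = sum(c_n * s ** n for n, c_n in enumerate(c))

print(abs(G_C - G_A * G_B))  # tiny, on the order of machine epsilon
```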

ANSWER

Generally speaking, generating functions are discussed as formal power series - that is, as purely algebraic objects, without regard to whether or not they converge. All of the operations that we carry out with them - addition, convolution, differentiation, integration, etc. - are also treated in a purely algebraic context, without regard to convergence.

Of course, you are free to examine the series as an analytic object - for instance, if you are trying to use complex-analytic methods to find asymptotics for the coefficients - but in practice I have always just dealt with convergence of the resulting series manually.

ANSWER

"Going slightly off topic, I'd like to ask if this is why you cannot use induction to prove that two expressions of $n\in\mathbb N$ are equivalent when $n$ goes to infinity."

Indeed, induction only gives you information about the "finite steps" (recall that infinity is not a natural number). To get information about the limiting behaviour (i.e. when you let $n$ go to infinity), you need analysis; induction is a purely set-theoretic/arithmetic statement.
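A standard illustration of this gap (my own example, not from the answer): by induction, $(1+1/n)^n$ is rational for every natural $n$, yet the limit is $e$, which is irrational. A property established for every finite $n$ need not survive the passage to the limit:

```python
from fractions import Fraction
import math

# Each term (1 + 1/n)^n = ((n+1)/n)^n is exactly rational -- a fact one
# can prove by induction on n -- yet the sequence converges to e,
# which is irrational. Induction alone says nothing about the limit.

def term(n):
    return Fraction(n + 1, n) ** n  # exact rational arithmetic

for n in (1, 10, 100):
    print(n, float(term(n)))  # approaches e, but every term is rational

print(math.e)
```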