If you have two different kinds of series representations like $$ \sum_{n=1}^{ \infty}f(n)=\sum_{k=1}^{ \infty}g(k),$$ does it follow that $f(n)=g(k)$?
Can you algebraically invert an infinite sum?
319 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 best solutions below.
Let $f(n)=\frac{\pi^2}{6\cdot 2^n}$ and $g(k)=\frac{1}{k^2}$. Then
$$\sum_{k=1}^{\infty}\frac{1}{k^2}=\frac{\pi^2}{6}$$
$$\sum_{n=1}^{\infty}\frac{\pi^2}{6\cdot 2^n}=\frac{\pi^2}{6}$$
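A quick numerical sketch of this counterexample (assuming Python, using only the standard library): both partial sums approach $\frac{\pi^2}{6}$ even though the individual terms $f(n)$ and $g(k)$ differ.

```python
import math

# g(k) = 1/k^2: converges slowly, so take many terms.
N = 100_000
g_sum = sum(1 / k**2 for k in range(1, N + 1))

# f(n) = pi^2 / (6 * 2^n): a geometric series, converges very fast.
f_sum = sum(math.pi**2 / (6 * 2**n) for n in range(1, 200))

target = math.pi**2 / 6
print(g_sum, f_sum, target)  # all three agree to several decimal places
```

Note that the first terms already differ: $g(1)=1$ while $f(1)=\pi^2/12\approx 0.82$, so equal sums clearly do not force equal terms.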
However, if your functions were the coefficients of power series in some variable $x$, as mentioned in the comment by JMoravitz,
$$F(x)=\sum_{n=1}^\infty f(n)x^n$$ $$G(x)=\sum_{k=1}^\infty g(k)x^k$$
then $F(x)=G(x)$ on a common interval of convergence does imply $f(n)=g(k)$ whenever $n=k$.
There is a property of a collection of objects called "linear independence". It means that no element of the collection can be written as a linear combination of the others.
For example, a set of vectors $\vec{a}, \vec{b}, \vec{c}$ is linearly independent if for all real numbers $p,q,s$, not all zero, we have:
$$p\vec{a}+ q\vec{b}+ s\vec{c} \neq 0$$
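To make this concrete, here is a small check with made-up vectors (the vectors and coefficients below are illustrative, not from the answer): $\vec{c}=\vec{a}+\vec{b}$, so the set is linearly *dependent*, and a nontrivial combination vanishes.

```python
# a, b, c are linearly DEPENDENT because c = a + b,
# i.e. 1*a + 1*b + (-1)*c = 0 with coefficients not all zero.
a = (1, 0)
b = (0, 1)
c = (1, 1)

p, q, s = 1, 1, -1
combo = tuple(p * ai + q * bi + s * ci for ai, bi, ci in zip(a, b, c))
print(combo)  # (0, 0)
```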
In the same way, functions $f(x),g(x),h(x)$ are linearly independent if there are no real numbers $p,q,s$, not all zero, for which this equation holds for every $x$:
$$p \cdot f(x)+ q \cdot g(x)+s \cdot h(x)=0$$
Obviously, there may be a few values of $x$ where the equation holds, but it can't hold for the whole domain.
Why is this important for your question?
Any collection of two or more constants is linearly dependent: we can always choose coefficients, not all zero, that make them sum to $0$. This is why equality of sums of constants does not imply equality of the individual terms.
Trivially $1+3=2+2$ doesn't mean that $1=2$ or $3=2$.
For infinite sums things are more delicate, but it can still be shown that even an infinite collection of constants fails to be linearly independent.
However, there are some families of functions which can be shown to be linearly independent.
One such family is the monomials $1,x,x^2,x^3, \dots$, which is why power series are so useful. If two power series converge and agree for every $x$ in a neighborhood of $0$, then their corresponding coefficients are also equal (as Eleven-Eleven says).
There are many other families of linearly independent functions, for example, trigonometric functions in the form:
$$\sin n t, \cos n t, \qquad n=0,1,2,3, \cdots$$
They are in fact almost the same as monomials: substitute $x=e^{it}$ and use Euler's formula from complex analysis.
These functions form the basis of Fourier series, which have the same property: if two such series are equal pointwise, their corresponding terms are equal.
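The independence of this trigonometric family follows from orthogonality, which we can sketch numerically (pure standard-library Python; the specific indices $m=2$, $n=3$ are just an example): the integral of $\sin(mt)\sin(nt)$ over $[0,2\pi]$ is $0$ for $m\neq n$ and $\pi$ for $m=n\ge 1$.

```python
import math

def inner(m, n, steps=10_000):
    """Riemann-sum approximation of the integral of sin(m t) sin(n t) over [0, 2*pi]."""
    h = 2 * math.pi / steps
    return sum(math.sin(m * i * h) * math.sin(n * i * h) for i in range(steps)) * h

print(round(inner(2, 3), 6))  # ~ 0   (orthogonal)
print(round(inner(2, 2), 6))  # ~ pi  (nonzero norm)
```

Pairwise-orthogonal nonzero functions are automatically linearly independent, which is what Fourier series rely on.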
Try $f(n)=2^{-n-1}$ and $g(k)=3^{-k}$: both are geometric series summing to $\frac{1}{2}$, yet $f(n)\neq g(n)$ for every $n$.
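A quick check of this answer's counterexample (standard-library Python): both geometric series sum to $\frac{1}{2}$, with no terms in common.

```python
# f(n) = 2^(-n-1): 1/4 + 1/8 + 1/16 + ... -> 1/2
f_sum = sum(2 ** (-n - 1) for n in range(1, 60))
# g(k) = 3^(-k):   1/3 + 1/9 + 1/27 + ... -> 1/2
g_sum = sum(3 ** (-k) for k in range(1, 60))
print(f_sum, g_sum)  # both ~ 0.5, though f(1)=1/4 while g(1)=1/3
```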