I can't find the mistake in my proof


I thought about something in power series and I proved a theorem that I have never seen before:
The theorem:

Let $\begin{align} \sum_{n=1}^{\infty} a_nx^n \end{align}$ be a power series with interval of convergence $R$ (it doesn't matter whether $R$ is open, closed, or half-open on either side). Then for every sequence $\{x_n\}_{n=1}^\infty$ with only finitely many elements satisfying $x_n\notin R$, the series $\begin{align}\sum_{n=1}^\infty a_nx_n^n\end{align}$ converges.

The proof:

Let $\begin{align}\sum_{n=1}^\infty a_nx^n\end{align}$ and $\{x_n\}_{n=1}^\infty$ be as in the statement. Define $A\subset\{x_1,x_2,\cdots,x_n,x_{n+1},\cdots\}$ to be the set of elements that are not in the interval $R$; i.e., if $x_i\in A$, then $x_i\notin R$. Now write $A=\{x_{1'},x_{2'},\cdots,x_{m'}\}$. We know that $|A|\in\mathbb{N}$, which means $m'\in\mathbb{N}$. From here, we look at the series $\begin{align}\sum_{n=m'+1}^\infty a_nx_n^n\end{align}$.
This series converges because every $j>m'$ satisfies $x_j\in R$.
But we know that $\begin{align}\sum_{n=0}^\infty b_n\end{align}$ converges iff $\begin{align}\sum_{n=m}^\infty b_n\end{align}$ converges. So we define $b_n=a_nx_n^n$ and $m=m'+1$.
And so $\begin{align}\sum_{n=0}^\infty a_nx_n^n\end{align}$ converges iff $\begin{align}\sum_{n=m'+1}^\infty a_nx_n^n\end{align}$ converges, which means $\begin{align}\sum_{n=0}^\infty a_nx_n^n\end{align}$ converges.$\square$

The problem is that I found a counterexample to my theorem, so I think I made a mistake in my proof, but I can't find where it is.
The counterexample:

The interval of convergence of $\begin{align}\sum_{n=1}^\infty \frac{x^n}{n}\end{align}$ is $R=[-1,1)$. I'll choose $x_n=(1.01)^{-\frac{1}{n}}$. It's easy to see that $x_n\in R$ for every $n>0$. So by the theorem, the series $\begin{align}\sum_{n=1}^\infty \frac{{({(1.01)}^{-\frac{1}{n}})}^n}{n}=\sum_{n=1}^\infty \frac{1.01^{-1}}{n}=\frac{1}{1.01}\sum_{n=1}^\infty \frac{1}{n} \end{align}$ should converge, but it obviously does not, since the harmonic series diverges.
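As a quick numerical sanity check of the counterexample (a Python sketch; the function name `partial_sum` is my own, not from the original), the key point is that $x_n^n = (1.01)^{-1}$ exactly, so the terms form a scaled harmonic series whose partial sums grow without bound:

```python
# x_n = 1.01**(-1/n), so x_n**n == 1/1.01 exactly, and the term
# a_n * x_n**n == (1/1.01) * (1/n): a scaled harmonic series.
def partial_sum(N):
    """Partial sum of ((1.01**(-1/n))**n) / n for n = 1..N."""
    return sum((1.01 ** (-1.0 / n)) ** n / n for n in range(1, N + 1))

# Harmonic-series divergence is logarithmic: slow, but unbounded.
for N in (10, 1_000, 100_000):
    print(N, partial_sum(N))
```

The partial sums keep growing with $N$ (roughly like $\frac{1}{1.01}\ln N$), confirming divergence despite every $x_n$ lying inside $[-1,1)$.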

Note - Other than helping me find the problem with my proof, if you know of a similar theorem, please let me know.

On BEST ANSWER

> We know that this series converges because every $j>m'$ satisfies $x_j\in R$.

Here is the problem. You are forming the sum $\sum_{n=m'+1}^\infty a_nx_n^n$, where each individual term $a_n x_n^n$ is taken from a convergent series $\sum_{k=m'+1}^\infty a_k x_n^k$, but that isn't sufficient to establish convergence of the new series. As you know, you can remove finitely many terms from a series without affecting its convergence, so a convergent series gives you essentially no control over any individual term. As a crude example, fix any $k \in \Bbb{N}$: the series $\sum_{n=0}^\infty \delta_{nk}$ converges to $1$, where $\delta_{nk}$ equals $1$ when $n = k$, and $0$ otherwise. But $\sum_{n=0}^\infty \delta_{nn} = \sum_{n=0}^\infty 1 = \infty$.
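The delta example can be made concrete (a small Python sketch; the helper name `delta` is my own): each "row" series $\sum_n \delta_{nk}$ for fixed $k$ converges, yet the "diagonal" series $\sum_n \delta_{nn}$ has unbounded partial sums.

```python
# delta(n, k) is 1 when n == k, and 0 otherwise (Kronecker delta).
def delta(n, k):
    return 1 if n == k else 0

# For any fixed k, sum_n delta(n, k) converges: only one nonzero term.
fixed_k_sum = sum(delta(n, 7) for n in range(100))       # equals 1

# The diagonal series sum_n delta(n, n) has every term equal to 1,
# so its partial sums grow without bound.
diagonal_partial = sum(delta(n, n) for n in range(100))  # equals 100
```

This mirrors the proof's flaw: picking the $n$th term from the $n$th convergent series gives no control over the resulting diagonal series.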

But, I can see that convergence in this case looks more plausible than in the general setting of your reasoning, given that all these points lie within the interval of convergence. Your counterexample shows that even this is not enough, however. If the $x_n$ approach an open endpoint of your interval of convergence quickly enough, then your series may track the divergent endpoint series more closely than it tracks any of the convergent series it samples from.