Does an induction argument work when $n \rightarrow \infty$?


My question is whether induction (and backward induction) can be applied as $n \rightarrow \infty$.

To explain what I mean, I will show inductively that $a_1=\ldots=a_{n}=0$ is the only solution to $$ 0 = a_1x + (a_1+a_{2})x^{2}+ \ldots+(a_1+\ldots+a_{n})x^{n}, \qquad \forall \ x\in (0,1]. $$ Dividing by $x$ and letting $x \to 0^+$, we must have $a_1=0$. Now assume that $a_1=\ldots=a_{i-1}=0$; then
$$ 0 = a_ix^i + (a_{i}+a_{i+1})x^{i+1}+ \ldots+(a_i+\ldots+a_{n})x^{n}, \qquad \forall \ x\in (0,1], $$ so dividing by $x^i$ and letting $x \to 0^+$ gives $a_i=0$. By induction, $a_1=\ldots=a_n=0$.
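As a sanity check of the finite-$n$ claim, here is a small symbolic sketch (assuming `sympy` is available; $n=4$ is arbitrary): a polynomial vanishing on all of $(0,1]$ must be the zero polynomial, so every coefficient $a_1+\ldots+a_i$ vanishes, forcing every $a_i=0$.

```python
import sympy as sp

n = 4
x = sp.symbols('x')
a = sp.symbols('a1:5')  # unknowns a1, a2, a3, a4

# f(x) = sum_{i=1}^n (a1 + ... + ai) x^i
f = sum(sum(a[:i]) * x**i for i in range(1, n + 1))

# A polynomial vanishing on all of (0, 1] is the zero polynomial,
# so each coefficient a1 + ... + ai must be 0.
coeffs = [sp.expand(f).coeff(x, i) for i in range(1, n + 1)]
solution = sp.solve(coeffs, a, dict=True)
print(solution)  # the only solution is a1 = a2 = a3 = a4 = 0
```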

My question is whether, and where, this argument breaks down as $n \rightarrow \infty$. Specifically, does the same inductive reasoning work to show that $0=a_1=a_2=\ldots$ when $$ 0 = \sum_{i=1}^\infty x^i \left(\sum_{j=1}^{i} a_{j} \right), \qquad \forall \ x\in (0,1]? $$

Also, what about backward induction? For example, if $$ 0 = \sum_{i=1}^n x^i \left(\sum_{j=i}^{n} a_{j} \right), \qquad \forall \ x\in \mathbb{R}, $$ then by backward induction I can show that $0=a_1=\ldots=a_n$ (i.e., for $x$ large enough we must have $a_n=0$; now assume that $a_n=\ldots=a_{i+1}=0$; then for $x$ large enough we must have $a_i=0$). Does the same reasoning apply to show that $0=a_1=a_2=\ldots$ when $$ 0 = \sum_{i=1}^{\infty} x^i \left(\sum_{j=i}^{\infty} a_{j} \right), \qquad \forall x\in \mathbb{R}? $$

Edit: I have added a bounty because I would like to get more intuition for why the induction argument fails. Specifically,

  • Assuming the series converges, say to an analytic function, can you provide an example where the induction argument holds for finite $n$ and then falls apart as $n \rightarrow \infty$?

  • What is the intuition for why the argument breaks down?

  • The proof that an analytic function which equals zero on an open ball is identically zero uses the maximum modulus principle (https://en.wikipedia.org/wiki/Maximum_modulus_principle) rather than first using a series expansion and then showing that all the coefficients are zero. Is there a proof that first uses a series expansion?

  • Is this related to the fundamental theorem of algebra (https://en.wikipedia.org/wiki/Fundamental_theorem_of_algebra) and why it cannot be expanded to an infinite dimensional polynomial?


There are 3 answers below.

Best answer:

No. In general, proving a statement for positive integer $n$ by induction does not prove it "as $n \to \infty$". Induction only proves the statement for each positive integer.

In fact, for many statements, it does not even make sense to take $n \to \infty$. Even when it does, there is always an extra element that has to be introduced. For example, in your case, your infinite sums have to be defined by some notion of convergence, which wasn't necessary for finite sums. Your proof by induction (for finite $n$) never made use of this notion of convergence, so it couldn't possibly be right to just take the limit as $n \to \infty$ and apply the same proof.
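To make this gap concrete, here is a minimal illustration (my own example, not from the answer above): induction proves that every finite partial sum of $\sum_{k\ge 1} 2^{-k}$ is strictly less than $1$, yet the limit equals exactly $1$, so a property proved for each finite $n$ is lost in the limit.

```python
from fractions import Fraction

# Induction proves: for every finite n, 1/2 + 1/4 + ... + 1/2^n < 1.
partial = Fraction(0)
for n in range(1, 50):
    partial += Fraction(1, 2**n)
    assert partial < 1  # the inductive statement holds at each finite n

# Yet the limit of the partial sums equals exactly 1, not < 1:
# no single induction step ever reaches the limiting object.
print(partial)  # 1 - 1/2^49, still strictly short of 1
```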

Another answer:

I'll answer the questions in your Edit here: your reasoning worked only for a fixed $n$, so although in this particular case the conclusion also holds for the infinite series, there would still be some work to be done.

There is a proof using the series expansion. If you have, say around $0$, $f(x) = \displaystyle\sum_{n=0}^{\infty} a_n x^n$ (assuming of course the radius of convergence is $>0$), then $f$ is $C^{\infty}$, one can calculate its derivatives term by term, and one gets $\frac{f^{(n)}(0)}{n!} = a_n$. Now if $f$ equals zero on an open ball around zero, then all its derivatives at $0$ are $0$ too, so the $a_n$'s are zero, and so $f=0$.
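A quick symbolic check of $\frac{f^{(n)}(0)}{n!} = a_n$ (assuming `sympy` is available; $f = e^x$ is just an arbitrary analytic example I chose):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)  # any analytic function serves as an illustration

# Recover the series coefficients via a_n = f^(n)(0) / n!
coeffs = [sp.diff(f, x, n).subs(x, 0) / sp.factorial(n) for n in range(5)]
print(coeffs)  # [1, 1, 1/2, 1/6, 1/24] -- the Taylor coefficients of exp
```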

It is in fact related to the fundamental theorem of algebra, which can be proved using analytic functions (through Liouville's theorem). And I think the proof could be adapted to fit some analytic functions (not all of them, though, I suspect; I would have to look into the details of the proof).

Another answer:

Theorem: If $(a_{k})_{k=0}^{\infty}$ is a sequence of complex numbers, and if $$ \sum_{k=0}^{\infty} a_{k} x^{k} = 0\quad\text{for all real $x$ in some neighborhood of $0$,} $$ then $a_{k} = 0$ for all $k$.

The simplest proof I know uses nothing but continuity of a convergent power series, i.e., the fact that if $\sum_{k} a_{k} x^{k}$ converges in some neighborhood $I$ of $0$, and if $f(x)$ denotes the sum for $x$ in $I$, then $f$ is continuous in $I$. (The details involve estimating the tail of a convergent power series by a convergent geometric series. Nothing as fancy as Taylor's theorem or the maximum modulus theorem is needed.)
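The parenthetical tail estimate can be sketched numerically (the constants $M$, $r$, $N$ and the coefficient sequence are illustrative choices of mine, not from the answer):

```python
# Tail estimate behind the continuity claim: if |a_k| <= M for all k and
# |x| <= r with 0 < r < 1, then |sum_{k>N} a_k x^k| <= M * r^(N+1) / (1 - r),
# a convergent geometric bound.
M, r, N = 1.0, 0.5, 10
a = [1.0] * 1000                       # coefficients with |a_k| <= M
tail = sum(a[k] * r**k for k in range(N + 1, len(a)))
bound = M * r**(N + 1) / (1 - r)
assert abs(tail) <= bound              # the geometric bound dominates the tail
print(tail, bound)
```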

The argument is contrapositive. Assume $a_{n} \neq 0$ is the first non-zero coefficient, i.e., $a_{k} = 0$ for $0 \leq k < n$. Writing $$ f(x) = \sum_{k=0}^{\infty} a_{k} x^{k} = \sum_{k=0}^{\infty} a_{n+k} x^{n+k} = x^{n} \sum_{k=0}^{\infty} a_{n+k} x^{k} = x^{n} g_{n}(x), $$ the function $g_{n}$ is continuous in some neighborhood of $0$, and $g_{n}(0) = a_{n} \neq 0$, so that $g_{n}$ is non-vanishing in some neighborhood of $0$. It follows that $f$ is non-vanishing in some deleted neighborhood of $0$. Contrapositively, if $f$ vanishes identically in some neighborhood of $0$, there is no non-zero coefficient of smallest index, i.e., $a_{k} = 0$ for all $k$.
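A concrete instance of the factorization $f = x^n g_n$ (my own example: $a_0 = a_1 = 0$ and $a_k = 1$ for $k \geq 2$, so $f(x) = x^2/(1-x)$ on $|x| < 1$; sketched with `sympy`):

```python
import sympy as sp

x = sp.symbols('x')
# f has a_0 = a_1 = 0 and a_k = 1 for k >= 2, summing to x^2/(1-x) on |x| < 1
f = x**2 / (1 - x)
g = sp.cancel(f / x**2)        # the factor g_n in f = x^n * g_n, here n = 2

assert g.subs(x, 0) == 1       # g_n(0) = a_n != 0
# Hence f is non-vanishing on a deleted neighborhood of 0, e.g. at x = 1/10:
assert f.subs(x, sp.Rational(1, 10)) != 0
```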


If I understand your motivating questions, the answer to each is yes. Specifically,

  1. If $$ \sum_{k=1}^{\infty} x^{k} \left(\sum_{j=1}^{k} a_{j}\right) = 0\quad\text{for all $x$ in $(0, 1]$,} $$ then $a_{j} = 0$ for all $j$.

  2. If $$ \sum_{k=1}^{\infty} x^{k} \left(\sum_{j=k}^{\infty} a_{j}\right) = 0\quad\text{for all real $x$,} $$ then $a_{j} = 0$ for all $j$.

In each case, the theorem guarantees you're "really" assuming the coefficients in parentheses vanish for all $k$ (and, in the second, that the indicated series converge for each $k$). The fact that $a_{j} = 0$ for all $j$ follows by elementary algebra.
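The "elementary algebra" step can be made explicit: each $a_j$ is a difference of consecutive coefficients in parentheses. A minimal sketch:

```python
from fractions import Fraction

# Case 1: the partial-sum coefficients c_k = a_1 + ... + a_k all vanish,
# so each a_j is recovered as a difference: a_1 = c_1, a_j = c_j - c_{j-1}.
c = [Fraction(0)] * 10                 # c_1 = ... = c_10 = 0
a = [c[0]] + [c[j] - c[j - 1] for j in range(1, len(c))]
assert all(v == 0 for v in a)

# Case 2 works the same way with tail sums t_k = a_k + a_{k+1} + ...:
# a_j = t_j - t_{j+1}, so vanishing tails force vanishing a_j as well.
```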

I'm not sure what you mean by the fourth question. As a general matter, I all but forbid my students from using pronouns; it's important to know (for oneself) exactly what noun is being discussed. If you can't decide what noun to use, there's a detail you don't understand. To annotate my uncertainties:

Is this[1] related to the fundamental theorem of algebra and why it[2] cannot be expanded[3] to an infinite dimensional[4] polynomial?

  1. "This" means "this phenomenon"? And does that mean "the fact that finite induction does not always extend to the limit as $n \to \infty$", some property of the series you asked about, something else?

  2. The fundamental theorem of algebra? Inductive proof?

  3. Extended?

  4. Degree (i.e., to an entire function)?

Anyway, I think the answer is "no", there's no connection to the fundamental theorem of algebra. You're really asking local questions about convergent power series, not global questions about polynomials and other entire functions.