Show that if $f''(0)$ exists and $f(0)=f'(0)=0$, then $\sum f(\frac1n)$ converges


I need help with this problem: Let $f$ be a continuous function on an interval that contains $0$, and let $a_n = f(\frac 1 n)$ (defined for $n$ large enough).

  1. Show that if $f''(0)$ exists and $f(0)=f'(0)=0$, then $\sum_{n=1}^\infty a_n$ converges.

I've already shown that if $\sum_{n=1}^\infty a_n$ converges, then $f(0)=0$, and that if $f'(0)$ exists and $\sum_{n=1}^\infty a_n$ converges, then $f'(0)=0$.

How do I show this one?


There are 3 solutions below.


By Taylor's formula with remainder, $|f(x)| \leq C x^{2}$ for some constant $C$ and all $x$ in a neighborhood of $0$. Hence $\sum |f(\frac 1 n)|$ is dominated by a constant times $\sum \frac 1 {n^{2}}$, which converges.
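As a numerical sanity check (not part of the proof), one can verify the quadratic bound for a sample function; the choice $f(x) = 1-\cos x$, which satisfies $f(0)=f'(0)=0$ and $f''(0)=1$, and the constant $C=1$ are assumptions made only for this sketch.

```python
import math

# Illustrative function with f(0) = f'(0) = 0 and f''(0) = 1.
def f(x):
    return 1.0 - math.cos(x)

C = 1.0  # Taylor bound: |1 - cos x| <= x^2 / 2 <= C * x^2 for all x

# Check |f(1/n)| <= C / n^2 over a range of n.
assert all(abs(f(1.0 / n)) <= C / n**2 for n in range(1, 10001))

# Partial sums of f(1/n) stay bounded (comparison with sum of 1/n^2).
partial = 0.0
for n in range(1, 100001):
    partial += f(1.0 / n)
print(partial)  # settles toward a finite limit
```

The bound $1-\cos x \le x^2/2$ holds for every real $x$, so here no restriction to a neighborhood of $0$ is even needed.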


As $f''(0)$ exists, $f'$ can be linearly approximated near $0$ (recall $f'(0)=0$): $$ f'(h) = m\cdot h + r(h), \qquad m := f''(0),$$

with some function $r$ satisfying $\frac {r(h)} h\to 0$ as $h\to 0$. Therefore we can find an interval $U:=[-a,a]$ on which $|r(h)| \le |h|$, and by that: $$x\in U\implies |f'(x)| \le (|m|+1)\,|x| $$

Integrating from $0$ to $x$ gives $|f(x)| \le \frac{|m|+1}{2}\,x^2$ for $x\in U$, and thus the sum converges:
for some $N$ we have $\frac 1 n \in U$ for all $n\ge N$, so the tail $\sum_{n=N}^\infty |f(1/n)|$ is dominated by a constant times the convergent series $\sum_{n=1}^\infty \frac 1 {n^{2}}$.


We can use the comparison test, based on the following fact: $$ \lim_{x\to 0} \frac{f(x)}{x^2}=\lim_{x\to 0}\frac{f'(x)}{2x}=\frac{f''(0)}{2}, $$ where L'Hopital's rule is used in the first equality. The second equality holds by the definition of the derivative: since $f'(0)=0$, $$ \frac{f'(x)}{2x}=\frac12\cdot\frac{f'(x)-f'(0)}{x-0}\to\frac{f''(0)}{2}. $$ (Note that $f''$ need only exist at $0$, so a second application of L'Hopital would not be justified.) Thus there is an $N$ such that for all $n\ge N$, $$ n^2|a_n| = n^2\left|f\left(\frac1n\right)\right|\le \frac{|f''(0)|}{2}+1 =: C, $$ hence $$ \sum_{n\ge 1}|a_n| \le \sum_{n=1}^{N-1} |a_n|+\sum_{n\ge N} \frac{C}{n^2}<\infty. $$ The series converges absolutely, which implies that it converges.
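As a hedged numerical illustration of this limit (not part of the proof), one can watch $n^2 f(1/n)$ approach $f''(0)/2$ and confirm the comparison bound; the sample function $f(x)=x\sin x$ (with $f(0)=f'(0)=0$ and $f''(0)=2$) is an assumption made only for this sketch.

```python
import math

# Illustrative f with f(0) = f'(0) = 0 and f''(0) = 2.
def f(x):
    return x * math.sin(x)

half_second_deriv = 1.0  # f''(0) / 2

# n^2 * f(1/n) = n * sin(1/n) should approach f''(0)/2 = 1.
for n in (10, 100, 1000, 10000):
    print(n, n**2 * f(1.0 / n))

# Comparison bound: |f(1/n)| <= (f''(0)/2 + 1) / n^2 for n >= N.
N = 10
bound = half_second_deriv + 1.0
assert all(abs(f(1.0 / n)) <= bound / n**2 for n in range(N, 100000))
```

Here $n^2 f(1/n) = n\sin(1/n)$, so the printed values approach $1$ from below, matching $f''(0)/2$.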