I need help with this problem:
Let $f$ be a continuous function on an interval containing $0$, and let $a_n = f\left(\frac{1}{n}\right)$ (for $n$ large enough that $\frac{1}{n}$ lies in the interval).
I've already shown that:
If $\sum_{n=1}^\infty \ a_n$ converges, then $f(0)=0$.
If $f'(0)$ exists and $\sum_{n=1}^\infty \ a_n$ converges, then $f'(0)=0$.
If $f''(0)$ exists and $f(0)=f'(0)=0$, then $\sum_{n=1}^\infty \ a_n$ converges.
How do I solve these two?

1. Suppose that $\sum_{n=1}^\infty a_n$ converges. Must $f'(0)$ exist?
2. Suppose that $f(0)=f'(0)=0$. Must $\sum_{n=1}^\infty a_n$ converge?
For (1), no: consider the function given by $f(0)=0$ and $f(x) = x\sin \frac{\pi}{x}$ for $x\neq 0$.
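To spell out why this example works (a quick check, not part of the original hint):

```latex
% (1): the series converges, yet f'(0) does not exist.
% Every term of the series vanishes:
\[
a_n = f\!\left(\tfrac{1}{n}\right) = \tfrac{1}{n}\,\sin(n\pi) = 0
\quad\text{for all } n \ge 1,
\]
% so \sum a_n converges trivially. But the difference quotient
\[
\frac{f(x)-f(0)}{x-0} = \sin\frac{\pi}{x}
\]
% oscillates between -1 and 1 as x -> 0, so the limit defining
% f'(0) does not exist. (f is continuous at 0 since |f(x)| <= |x|.)
```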
For (2), no: consider the function on $\left(-\frac12,\frac12\right)$ defined by $f(0)=0$ and $f(x) = \frac{x}{\ln |x|}$ for $x\neq 0$ (the absolute value makes the formula valid for negative $x$ as well). Recall that $\sum_{n=2}^\infty \frac{1}{n\ln n} = \infty$, e.g., by the integral test.
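Here is the verification, writing $\ln|x|$ so the formula makes sense for negative $x$ near $0$:

```latex
% (2): f(0) = f'(0) = 0, yet the series diverges.
% Differentiability at 0:
\[
f'(0) = \lim_{x\to 0}\frac{f(x)-f(0)}{x-0}
      = \lim_{x\to 0}\frac{1}{\ln|x|} = 0,
\]
% while the terms of the series are
\[
a_n = f\!\left(\tfrac{1}{n}\right)
    = \frac{1/n}{\ln(1/n)}
    = -\frac{1}{n\ln n} \quad (n \ge 2),
\]
% and \sum_{n\ge 2} 1/(n \ln n) diverges by the integral test,
% so \sum a_n diverges (to -\infty).
```

Note that $f$ is indeed continuous at $0$, since $\frac{x}{\ln|x|}\to 0$ as $x\to 0$.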