Doubt about Taylor series: do successive derivatives on a point determine the whole function?


I'm currently relearning Taylor series, and yesterday I thought about something that left me puzzled. As far as I understand, whenever you take the Taylor series of any function $f(x)$ around a point $x = a$, the function is exactly equal to its Taylor series, that is:

$$ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n $$

For example, if we take $f(x) = e^x$ and $x = 0$, we obtain: $ e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} $

My doubt is: the only inputs to the Taylor series formula are $f(a), f'(a), f''(a),$ etc., that is, the successive derivatives of the function $f$ evaluated at one point $x = a$. But the Taylor series of $f(x)$ determines the whole function! How is it possible that the successive derivatives of the function evaluated at a single point determine the whole function? Does this mean that if we know the values of $f^{(n)}(a)$, then $f$ is uniquely determined? Is there an intuition as to why the successive derivatives of $f$ at a single point encode the information needed to determine $f$ uniquely?

Maybe I'm missing a key insight and all my reasoning is wrong; if so, please tell me where my mistake is.

Thanks!

There are 4 best solutions below

On BEST ANSWER

Your suspicion is warranted: in general, $f$ is not determined by its derivatives at a single point. Functions satisfying this condition are called analytic. But not all smooth functions are analytic. For example,

$$x\mapsto\begin{cases}e^{-1/x^2}, & x>0\\ 0, & x\leq 0\end{cases}$$ is a smooth function whose derivatives at zero are all zero; hence its Taylor series at zero is identically zero and does not determine the function.
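To see this numerically, here is a small Python sketch (the helper name `f` is mine): symmetric finite differences at $0$ come out as $0$, matching the fact that every derivative vanishes there, while the function itself is visibly nonzero just to the right of $0$.

```python
import math

def f(x):
    # smooth but non-analytic at 0: e^{-1/x^2} for x > 0, and 0 otherwise
    return math.exp(-1.0 / x**2) if x > 0 else 0.0

# Finite-difference approximations of the first and second derivative at 0
# come out as 0, in line with every true derivative at 0 vanishing:
h = 1e-2
first = (f(h) - f(-h)) / (2 * h)
second = (f(h) - 2 * f(0) + f(-h)) / h**2
print(first, second)   # both ~ 0

# ...so the Taylor series at 0 is identically 0, yet f is nonzero for x > 0:
print(f(0.5))          # e^{-4}, about 0.0183
```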

Furthermore, the exact statement of Taylor's theorem is quite different from what you wrote. It is as follows:

If $f\in C^{k+1}(\mathbb{R})$, then $$f(x)=\sum_{n=0}^k \frac{f^{(n)}(a)}{n!}(x-a)^n + \frac{f^{(k+1)}(\xi)}{(k+1)!}(x-a)^{k+1}$$ for some $\xi$ between $a$ and $x$.

If you now take $k\rightarrow\infty$, it is in general not clear that this error term converges to zero.
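By contrast with the pathological example above, for an analytic function such as $e^x$ the remainder does go to zero; here is a quick Python check of the partial sums (variable names are mine):

```python
import math

# Partial sums of sum_n x^n / n! converge to e^x: the remainder term
# in Taylor's theorem vanishes as the degree k grows.
x = 2.0
partial, term = 0.0, 1.0
for n in range(25):
    partial += term        # term equals x**n / n! at this point
    term *= x / (n + 1)
print(abs(partial - math.exp(x)))  # tiny: the remainder has gone to 0
```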

On

Functions which are the sum of their Taylor series within the interval (or disk for functions of a complex variable) of convergence are known as analytic functions. Many basic elementary functions are analytic: $\;\exp, \sin,\cos,\sinh,\cosh $ and of course polynomials are analytic on $\mathbf R$ (or $\mathbf C$).

It is not true that, in general, an infinitely differentiable function of a real variable is analytic on the interval of convergence of its Taylor series, as @humanStampedist's example shows.

However, for a function of a complex variable, being differentiable on an open set suffices to ensure the function is analytic there (one usually says holomorphic in this case). This is due to the very strong constraints imposed by the Cauchy-Riemann equations.
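For concreteness, these are the equations being referred to: writing $f(x+iy) = u(x,y) + i\,v(x,y)$, the Cauchy-Riemann equations read
$$\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y}, \qquad \frac{\partial u}{\partial y} = -\frac{\partial v}{\partial x},$$
and complex differentiability of $f$ on an open set forces them to hold throughout that set.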

On

One doesn't have to try very hard to find a function that does not agree with its Taylor series everywhere. The absolute value function is a familiar enough function. The Taylor series of $|x|$ at $x = 1$ is $1 + (x-1) = x$: all terms of degree two and higher vanish. (If you ignore the left half of the graph of $|x|$, you should see that this function is "trying" to be a straight line in any sufficiently small open neighborhood of the expansion point $1$ that stays to the right of $0$.)

This Taylor expansion is identical to the function on $x \geq 0$, and is hilariously wrong for $x < 0$. However, expanding the Taylor series at any point on the left half of the real line gives $-x$, which is identical to the function on $x \leq 0$ and hilariously wrong on the right half of the real line.

Why did the Taylor series not "work" everywhere? In any little neighborhood of an $x$ that does not include $0$, the function $|x|$ looks like either a line with slope $1$ or a line with slope $-1$, so this is all the derivatives can see. The sudden change in behaviour at $x = 0$ is not signalled in the derivatives anywhere (except that none of the derivatives exist at $x = 0$ itself). It's almost as if the non-differentiability at $x=0$ acts as a barrier: the Taylor series on one side of that barrier does not replicate behaviour from the other side. (... except in carefully contrived accidents, like $\frac{x^2}{x}$, which is undefined at $0$ and so has no derivatives there, yet agrees with any of its Taylor series wherever it is defined.)
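The expansion at $1$ described in this answer can be checked in a couple of lines of Python (the function name is mine):

```python
# Taylor series of |x| at a = 1: 1 + (x - 1), i.e. just x.
def taylor_about_1(x):
    return 1 + (x - 1)

print(taylor_about_1(2.0), abs(2.0))    # 2.0 2.0 -- agrees for x >= 0
print(taylor_about_1(-1.0), abs(-1.0))  # -1.0 1.0 -- wrong for x < 0
```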

On

HumanStampedist has adequately answered the question. I'd like to mention that, just as there are continuous functions that are nowhere differentiable, such as the Weierstrass function, there are smooth functions (all $n$th derivatives exist at every point) that are nowhere analytic, i.e. at no point does the Taylor series converge to the original function. An example is the Fabius function.