Existence of Taylor series and Analyticity


I have realised my understanding of Taylor series is not as complete as I would like it to be, so I have formulated some questions to which I am struggling to find answers that I can understand:

  1. Is the following statement true: a function is analytic at a point $\iff$ its Taylor series exists at that point?
  2. I know that not all smooth functions are represented by their Taylor expansion, but for the examples that I have seen I don't really understand why, e.g. $f(x)=e^{-1/x^2}$ for $x>0$ and $f(x)=0$ for $x\leq 0$.
  3. On Wikipedia it says, "A function may differ from the sum of its Taylor series, even if its Taylor series is convergent." When would this be the case and why does it not converge to the function?
  4. If at a point the first derivative exists can an infinite amount of derivatives be taken?
  5. If a function is analytic does it mean that it has a Taylor series at every point?

I know that there are quite a few questions here, so help with even one would be greatly appreciated.

On BEST ANSWER
  1. No. A function $f(x)$ is analytic at a point $x_0$ if its Taylor series exists at $x_0$ and converges to $f(x)$ in a neighborhood of $x_0$. The Taylor series may exist but not converge, or it may converge but not to $f(x)$.

  2. The basic idea is that $f(x)$ decreases so quickly as $x \to 0$ that all of its derivatives exist and are equal to $0$ at $x = 0$; more formally we have $\lim_{x \to 0} \frac{f(x)}{x^m} = 0$, which can be established e.g. by taking logarithms. So the Taylor series at $x = 0$ has all coefficients zero, which means it doesn't converge to $f(x)$ in any neighborhood of $0$. On the other hand, for $x > 0$, $f(x)$ is a composition of smooth functions and hence smooth.

  3. The previous function is an example; its Taylor series at $x = 0$ converges to zero on every neighborhood of $0$, but the function is nonzero for $x > 0$. Generally, the way we prove that a Taylor series converges to the function (when it does) is to use Taylor's theorem with remainder, which bounds the error of the approximation given by taking the first $n$ terms of the Taylor series. But it can happen that this bound doesn't go to $0$ as $n \to \infty$; in that case Taylor's theorem doesn't tell us that the Taylor series converges to the function.

  4. Not necessarily. For example, the function $f(x) = x^k |x|$ is differentiable at $x = 0$ exactly $k$ times, but not $k+1$ times.

  5. Yes; more precisely it means that at every point the Taylor series exists and converges to $f(x)$ in some neighborhood.
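
The counterexample in item 4 can be checked numerically. Below is a minimal sketch for $k = 1$, i.e. $f(x) = x|x|$, using difference quotients (the function names are my own):

```python
def f(x):
    return x * abs(x)  # k = 1: differentiable exactly once at x = 0

# f'(0) exists: the symmetric quotient (f(h) - f(-h)) / (2h) = |h| -> 0.
for h in (1e-2, 1e-4, 1e-6):
    print((f(h) - f(-h)) / (2 * h))

def fprime(x):
    return 2 * abs(x)  # f'(x) = 2|x|, valid everywhere including 0

# f''(0) does not exist: the one-sided quotients of f' at 0 disagree.
for h in (1e-2, 1e-4):
    print((fprime(h) - fprime(0)) / h,      # -> +2 from the right
          (fprime(-h) - fprime(0)) / (-h))  # -> -2 from the left
```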

As an additional comment on question 2, if a function $f(x)$ is analytic at $x = 0$ then it must decay at worst like $x^m$ as $x \to 0$, where $m$ is the first index such that $f^{(m)}(0) \neq 0$. So analytic functions can't decay faster than polynomially at any point.
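
Points 2 and 3 are easy to see numerically as well; here is a quick illustrative sketch (the function name is mine):

```python
import math

def f(x):
    """The flat function from question 2: e^{-1/x^2} for x > 0, else 0."""
    return math.exp(-1.0 / x**2) if x > 0 else 0.0

# f decays faster than any power: f(x)/x^m is tiny for small x, for every m,
# so every Taylor coefficient of f at 0 is zero.
for m in (1, 5, 10):
    print(m, f(0.1) / 0.1**m)

# The Taylor series at 0 is therefore identically zero, yet f is not:
print(f(0.5))  # e^{-4}, about 0.018
```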

On

The most usual counterexample for question 3 is given in your question 2:
$f(x)=e^{-\frac 1 {x^2}}$ for $x \ne 0$, and $f(0)=0$.

$f$ is continuous and $C^\infty$ at $0$: its derivatives are $f^{(n)}(x) = R_n(x) e^{-\frac 1 {x^2}}$,
where $R_n(x)$ is a rational function of $x$ (a quotient of polynomials in $x$). This can be proved by induction on $n$.
Since $e^{-\frac 1 {x^2}}$ tends to $0$ faster than any $\frac 1 {x^k}$ tends to $\infty$ at $0$, all derivatives of $f$ vanish at $0$.
So the Taylor series of $f$ at $0$ exists and converges: it is the constant zero function. But $f$ is not identically zero, so $f$ is not equal to its Taylor series on any neighborhood of $0$.
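
The shape $f^{(n)}(x) = R_n(x)\,e^{-1/x^2}$ and the vanishing limits can be spot-checked symbolically; a sketch, assuming sympy is available:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)

# Each derivative is a rational function of x times e^{-1/x^2} ...
for n in (1, 2, 3):
    print(sp.simplify(sp.diff(f, x, n)))

# ... and each tends to 0 at x = 0, since the exponential beats any 1/x^k:
print(sp.limit(sp.diff(f, x, 3), x, 0))
```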

(Actually the key point may be that when people write "the function is equal to its Taylor series at $x_0$", they mean "the function and its Taylor series at $x_0$ are equal on a neighborhood of $x_0$", not just at $x_0$. A function is always trivially equal to its Taylor series at the point $x_0$ itself.)

This is also a counterexample to question 1: if $f$ were analytic at $0$, it would be equal to its Taylor series on some neighborhood of $0$, which it is not.

Question 4: if a function has a derivative at $x_0$, it does not follow that it has a second, third, etc., derivative at the same point. Take for example $f(x) = x^{3/2}$ (for $x \geq 0$). It has a first derivative $f'(x)=\frac 3 2 \sqrt x$, which vanishes at $0$. But the second derivative $f^{(2)}(x)= \frac 3 4 x^{-1/2}$ is not defined at $0$.
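
This example, too, can be seen numerically; a sketch restricting to $x \ge 0$ (one-sided quotients, since $x^{3/2}$ is only defined there):

```python
def f(x):
    return x ** 1.5  # defined for x >= 0

# f'(0) exists: the quotient f(h)/h = sqrt(h) -> 0 as h -> 0+.
for h in (1e-2, 1e-4, 1e-6):
    print(f(h) / h)

# But f''(0) does not: the quotient of f'(x) = 1.5*sqrt(x) at 0 is
# 1.5*sqrt(h)/h = 1.5/sqrt(h), which blows up as h -> 0+.
for h in (1e-2, 1e-4, 1e-6):
    print(1.5 * h**0.5 / h)
```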

Question 5: if a function is analytic at every point, it has a Taylor series at every point, and that series converges to the function on a neighborhood of each point. Cf. for example Wikipedia's page on analytic functions.