Show that a function with specific bound on its derivatives is analytic


I'm solving old exam problems in real analysis, so only methods from real analysis may be used. I've been trying to solve the problem below and have run into some issues.

Let $f$ be a $C^\infty$ function in a neighborhood of the point $x_0\in\mathbb{R}$. Assume that there exist positive numbers $\delta$ and $M$ such that for any $x\in(x_0-\delta,x_0+\delta)$ one has the estimate $$\left|\frac{d^kf(x)}{dx^k}\right|\leq M\frac{k!}{\delta^k}.$$ Show that under these assumptions $$f(x)=\sum_{k=0}^\infty\frac{1}{k!}\frac{d^kf(x_0)}{dx^k}(x-x_0)^k.$$ Note that this means that the estimate above implies that $f$ is analytic at $x_0$.

My first thought is to try to find something that looks like what we want by dividing the estimate by $k!/\delta^k$, or equivalently, multiplying both sides by $\delta^k/k!$: $$\left|\frac{d^kf(x)}{dx^k}\right|\leq M\frac{k!}{\delta^k}\Leftrightarrow\frac{\delta^k}{k!}\left|\frac{d^kf(x)}{dx^k}\right|\leq M\frac{k!}{\delta^k}\frac{\delta^k}{k!}=M$$ for all $x\in(x_0-\delta,x_0+\delta)$. Since $M$ is a fixed number, the left-hand side is bounded. For $x\in(x_0-\delta,x_0+\delta)$ we have $|x_0-x|<\delta$, so $|x_0-x|^k\leq\delta^k$. Hence $$\frac{1}{k!}\left|\frac{d^kf(x)}{dx^k}\right||x_0-x|^k\leq\frac{1}{k!}\left|\frac{d^kf(x)}{dx^k}\right|\delta^k\leq M.$$ This means that each summand of the Taylor series is bounded by $M$.
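As a sanity check (not part of the proof), the rescaled bound can be verified numerically for a concrete function satisfying the hypothesis. The choices $f(x)=1/(1-x)$, $x_0=0$, $\delta=1/2$, $M=2$ below are my own illustration, not part of the problem; for this $f$ one has $f^{(k)}(x)=k!/(1-x)^{k+1}$, so on $(-1/2,1/2)$ the estimate $|f^{(k)}(x)|\leq 2\,k!/\delta^k$ holds.

```python
import math

# Illustrative example (my choice, not from the problem):
# f(x) = 1/(1 - x), so f^(k)(x) = k! / (1 - x)^(k+1).
# On (x0 - delta, x0 + delta) = (-1/2, 1/2) the hypothesis holds with M = 2,
# since |f^(k)(x)| <= k! / (1/2)^(k+1) = 2 * k! / delta^k.
x0, delta, M = 0.0, 0.5, 2.0

def fk(x, k):
    """k-th derivative of 1/(1 - x)."""
    return math.factorial(k) / (1 - x) ** (k + 1)

# Check the rescaled bound (delta^k / k!) * |f^(k)(x)| <= M on a sample grid.
for k in range(15):
    for x in [x0 - 0.49, x0 - 0.25, x0, x0 + 0.25, x0 + 0.49]:
        assert delta**k / math.factorial(k) * abs(fk(x, k)) <= M
print("rescaled derivative bound holds on the sampled grid")
```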

I don't really know how to progress from this step, or whether anything I've deduced is of any value. I guess that showing pointwise or uniform convergence might give us what we want, e.g. showing that the series converges to a function, but how do we know that it is the function we wanted to begin with? I feel like I'm missing something obvious here.

Best answer:

If you take $x\in (x_0 - \rho, x_0+\rho)$ with $\rho < \delta$, then the terms of the Taylor series are bounded by $M\frac{\rho^k}{\delta^k}$. Since $\rho/\delta<1$, these bounds form a convergent geometric series, so you can apply the Weierstrass M-test. Hence the series converges absolutely and uniformly on $(x_0 - \rho, x_0+\rho)$ and $f$ is analytic at $x_0$.
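A small numerical illustration of the M-test step, again with the hypothetical sample data $f(x)=1/(1-x)$, $x_0=0$, $\delta=1/2$, $M=2$ (for which all Taylor coefficients $f^{(k)}(0)/k!$ equal $1$): on $[-\rho,\rho]$ with $\rho<\delta$, each Taylor term is dominated by $M(\rho/\delta)^k$, a summable bound independent of $x$.

```python
import math

# Weierstrass M-test illustration (sample function, my choice):
# f(x) = 1/(1 - x) at x0 = 0, with delta = 1/2, M = 2.
# The k-th Taylor coefficient f^(k)(x0)/k! equals 1 for this f.
x0, delta, M = 0.0, 0.5, 2.0
rho = 0.4  # any rho < delta

def taylor_term(x, k):
    return 1.0 * (x - x0) ** k

# The uniform bound M * (rho/delta)^k dominates |term| for all |x - x0| <= rho.
for k in range(30):
    majorant = M * (rho / delta) ** k
    worst = max(abs(taylor_term(x, k)) for x in [-rho, 0.0, rho])
    assert worst <= majorant

# The dominating series is geometric with ratio rho/delta < 1, so it sums to
# M / (1 - rho/delta); hence the Taylor series converges uniformly on [-rho, rho].
print("geometric majorant sums to:", M / (1 - rho / delta))
```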

EDIT: This argument only shows that the Taylor series defines an analytic function $g$ around $x_0$. To see that $g=f$, fix an arbitrary point $x\in (x_0 - \rho, x_0+\rho)$, use the Lagrange form of the remainder for the Taylor expansion of $f$ up to order $n$, and then use the estimates to show that $|f(x)-g(x)| =0$ by letting $n\to \infty$.

Writing $f(x)=\sum_{k=0}^n\frac{f^{(k)}(x_0)}{k!}(x-x_0)^k+\frac{f^{(n+1)}(x^\ast)}{(n+1)!}(x-x_0)^{n+1}$ for some $x^\ast$ (depending on $n$) between $x_0$ and $x$, we get $$|f(x)-g(x)| = \left|\frac{f^{(n+1)}(x^\ast)}{(n+1)!}(x-x_0)^{n+1}-\sum_{k=n+1}^\infty\frac{f^{(k)}(x_0)}{k!}(x-x_0)^k\right|\leq \\ \sum_{k=n+1}^\infty\frac{|f^{(k)}(x_0)|}{k!}|x-x_0|^k+\frac{|f^{(n+1)}(x^\ast)|}{(n+1)!}|x-x_0|^{n+1} \leq \sum_{k=n+1}^\infty M\frac{k!}{\delta^k}\frac{|x-x_0|^k}{k!} + M\frac{(n+1)!}{\delta^{n+1}} \frac{|x-x_0|^{n+1}}{(n+1)!}=\sum_{k=n+1}^\infty M\frac{|x-x_0|^k}{\delta^k} + M\frac{|x-x_0|^{n+1}}{\delta^{n+1}}$$ Since $|x-x_0|<\delta$, the sum $\sum_{k=n+1}^\infty M\frac{|x-x_0|^k}{\delta^k}$ is the tail of a convergent geometric series, hence it tends to $0$ as $n\to \infty$. Also $M\frac{|x-x_0|^{n+1}}{\delta^{n+1}}\to 0$ as $n\to \infty$, and we conclude that $|f(x)-g(x)| = 0$, i.e. $f(x)=g(x)$.
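As a final numerical check (again with the hypothetical sample data $f(x)=1/(1-x)$, $x_0=0$, $\delta=1/2$, $M=2$, for which all Taylor coefficients are $1$), the actual error $|f(x)-T_n(x)|$ stays below the bound $M r^{n+1}+M\frac{r^{n+1}}{1-r}$ with $r=|x-x_0|/\delta$, and both shrink to $0$ as $n$ grows:

```python
import math

# Remainder estimate illustration (sample function, my choice):
# f(x) = 1/(1 - x), x0 = 0, delta = 1/2, M = 2; Taylor coefficients all 1.
x0, delta, M = 0.0, 0.5, 2.0
x = 0.3  # a point with |x - x0| < delta

def f(t):
    return 1 / (1 - t)

def T(t, n):
    """Taylor polynomial of order n about x0 (coefficients are all 1)."""
    return sum((t - x0) ** k for k in range(n + 1))

r = abs(x - x0) / delta
prev = float("inf")
for n in [5, 10, 20, 40]:
    actual = abs(f(x) - T(x, n))
    # Bound from the answer: Lagrange remainder term + geometric tail.
    bound = M * r ** (n + 1) + M * r ** (n + 1) / (1 - r)
    assert actual <= bound
    assert actual < prev  # error decreases toward 0
    prev = actual
print("error at n = 40:", abs(f(x) - T(x, 40)))
```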