Why do these two methods of finding Taylor's expansion generate the same result?


Let's take: $f(x)=e^{2x+1}$

We can find the Taylor-expansion centered around $-\frac{1}{2}$ in two ways:

1) Use the definition: $f(x)=f(a)+\frac{f'(a)(x-a)}{1!}+\frac{f''(a)(x-a)^2}{2!}+\frac{f^{(3)}(a)(x-a)^3}{3!}+\cdots$ for $a=-\frac{1}{2}$

2) Expand $e^u$ around $u=0$, which gives $e^u=1+u+\frac{u^2}{2!}+\frac{u^3}{3!}+\cdots$, and then substitute $u=2x+1$

Both ways yield the same solution around $x=-\frac{1}{2}$, namely: $$f(x)=1+\frac{(2x+1)}{1!}+\frac{(2x+1)^2}{2!}+\frac{(2x+1)^3}{3!}+...$$
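As a quick numerical sanity check (the helper names below are illustrative, not from the question), partial sums of the common series $\sum_{n\ge 0}\frac{(2x+1)^n}{n!}$ can be compared against direct evaluation of $e^{2x+1}$:

```python
import math

def f(x):
    """Direct evaluation of f(x) = e^(2x+1)."""
    return math.exp(2 * x + 1)

def taylor_partial_sum(x, terms=20):
    """Partial sum of sum_{n>=0} (2x+1)^n / n!,
    the series both methods produce around x = -1/2."""
    u = 2 * x + 1
    return sum(u ** n / math.factorial(n) for n in range(terms))

# Near the center x = -1/2 the partial sum matches exp(2x+1) closely.
for x in (-0.5, -0.3, 0.0, 0.25):
    assert abs(f(x) - taylor_partial_sum(x)) < 1e-12
```

At the center itself, $x=-\tfrac{1}{2}$, the series collapses to its constant term $1 = e^0$, as expected.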

Question: Why do both methods yield the same result? It "feels" to me as if the second method disregards the chain rule. I am trying to understand why, under the hood, both methods amount to the same result.

Basically the question boils down to the question of why the substitution is allowed. I think this answer has something to do with it: https://math.stackexchange.com/a/1855698/245761

There are 3 answers below.

BEST ANSWER

There are two points to consider here:

1) Validity of a formula for the range of values of the parameters involved. The formula $$e^{x} = 1 + x + \frac{x^{2}}{2!} + \cdots\tag{1}$$ is valid for all values of $x$, real or complex, so we may replace $x$ by anything that takes real or complex values; in particular, replacing $x$ by $2x + 1$ is valid. For that matter, you can replace $x$ by a complicated expression like $\sqrt{1 + x^{2}}$ (or, more interestingly, by $\log x$) and the result remains true, but in those cases the resulting series is not a Taylor series. If, however, you replace $x$ by $ax + b$, the result is a power series in $(x + b/a)$, which matches the Taylor expansion around $x = -b/a$.

2) Next is the concept of the Taylor series of a function: it can be proved that if a function $f(x)$ has a Taylor series expansion around a point $a$ in powers of $(x - a)$, then this series is unique. By definition of the Taylor series, the coefficient of $(x - a)^{n}$ is $f^{(n)}(a)/n!$. Uniqueness says that if we can express $f(x)$ as a power series in powers of $(x - a)$, then $f$ is infinitely differentiable around $a$ and the coefficients must be $f^{(n)}(a)/n!$, no matter how the expansion of $f$ in powers of $(x - a)$ was obtained.
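Uniqueness can be illustrated numerically for $f(x)=e^{2x+1}$: compute the coefficient of $(x+\tfrac12)^n$ once from the definition (derivatives estimated by a central finite difference) and once from the substituted series, where $\frac{(2x+1)^n}{n!} = \frac{2^n}{n!}\left(x+\tfrac12\right)^n$. A sketch, with illustrative helper names and a tolerance chosen for finite-difference error:

```python
import math

def f(x):
    return math.exp(2 * x + 1)

def nth_derivative(fn, a, n, h=1e-2):
    """Central finite-difference estimate of fn^(n)(a)."""
    return sum((-1) ** k * math.comb(n, k) * fn(a + (n / 2 - k) * h)
               for k in range(n + 1)) / h ** n

a = -0.5
for n in range(5):
    # Definition route: coefficient = f^(n)(a) / n!.
    from_def = nth_derivative(f, a, n) / math.factorial(n)
    # Substitution route: u^n/n! with u = 2(x + 1/2), coefficient = 2^n / n!.
    from_sub = 2 ** n / math.factorial(n)
    assert abs(from_def - from_sub) < 1e-4
```

Both routes produce the coefficient $2^n/n!$, which is what uniqueness guarantees.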

ANSWER

The Taylor expansion is unique: if $f(x) = \sum_n a_n x^n$ for $|x| < R$, then $f \equiv 0$ on $|x| < R$ iff $a_n = 0$ for all $n$.

In particular, if you find $f(x) = \sum_n a_n x^n =\sum_n b_n x^n$ and both series are valid for $|x| < \rho$, where $\rho >0$, then we must have $a_n =b_n$ for all $n$.

The general result (see Conway, "Functions of one complex variable", ch. IV, Theorem 3.7) is: if $U$ is an open connected set and $f$ is analytic on $U$, then $f \equiv 0$ iff there is some $x \in U$ such that $f^{(n)}(x) = 0$ for all $n$, iff the set of zeros of $f$ has a limit point in $U$.

ANSWER

Let $f(x) = g(\alpha(x-a))$. Then, by the chain rule, $f^{(n)}(x) = \alpha^n g^{(n)}(\alpha(x-a))$, so $$f^{(n)}(a) = \alpha^n g^{(n)}(0).$$ Thus, \begin{align*} f(x) &= \sum_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x-a)^n \\ &= \sum_{n=0}^\infty \frac{\alpha^n g^{(n)}(0)}{n!} (x-a)^n \\ &= \sum_{n=0}^\infty \frac{g^{(n)}(0)}{n!}(\alpha(x-a))^n. \end{align*} The last sum is exactly the Maclaurin series of $g$ evaluated at $u = \alpha(x-a)$, so the chain rule and the substitution produce the same series.
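The identity $f^{(n)}(a) = \alpha^n g^{(n)}(0)$ can be checked numerically for a generic $g$; the values of $g$, $\alpha$, and $a$ below are arbitrary picks for illustration, and the loose tolerance accounts for finite-difference error:

```python
import math

def nth_derivative(fn, a, n, h=1e-2):
    """Central finite-difference estimate of fn^(n)(a)."""
    return sum((-1) ** k * math.comb(n, k) * fn(a + (n / 2 - k) * h)
               for k in range(n + 1)) / h ** n

g = math.cos           # any smooth g works; cos is an arbitrary choice
alpha, a = 3.0, 1.0

def f(x):
    return g(alpha * (x - a))

for n in range(5):
    lhs = nth_derivative(f, a, n)          # f^(n)(a), estimated directly
    rhs = alpha ** n * nth_derivative(g, 0.0, n)  # alpha^n g^(n)(0)
    assert abs(lhs - rhs) < 0.05
```

Applied to the question's example, $g = \exp$, $\alpha = 2$, $a = -\tfrac12$ gives $f^{(n)}(-\tfrac12) = 2^n e^0 = 2^n$, which is precisely what method 1 plugs into the definition.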