I understand what a Taylor series is and how to find the Taylor series of a function. However I do not understand intuitively what it means to find a Taylor series for a specific function, centered at $x=a$ compared to $x=b.$ Can someone explain please?
What does it mean intuitively for a Taylor Series to be centered at a specific point?
29.4k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 6 best solutions below.
---
A Taylor series is an approximation of a function with 'easy' functions, namely polynomials. A major difference between a Taylor series centered at one point vs. another is the domain over which the series converges.
For example the Taylor series for $\displaystyle f(x) = \frac{1}{1-x}$ centered at $x = 0$ is
$$T_0(x) = 1 + x + x^2 + x^3 + x^4 + ...$$
This series converges on the interval $(-1,1)$. In other words, for all $x \in (-1,1), T_0(x) = f(x)$. However outside that interval, not so. For instance $f(-2) = 1/3$ but
$$T_0(-2) = 1 - 2 + 4 - 8 + 16 - \cdots$$ clearly doesn't converge.
Were we to write out the Taylor series of $f(x)$ centered at $x = -1$, we would have,
$$T_{-1}(x) = \frac{1}{2} + \frac{x+1}{4} + \frac{(x+1)^2}{8} + \frac{(x+1)^3}{16} + ...$$
This series, $T_{-1}$, converges instead on the interval $(-3,1)$. The expression for $T_{-1}(-2)$ does converge (check!).
Some Taylor series converge everywhere, such as the Taylor series for $e^x$ or $\sin x$.
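The convergence behavior described above is easy to check numerically. The sketch below (helper names are my own, not from the answer) sums the first few terms of $T_0$ and $T_{-1}$ at $x = -2$: the series centered at $0$ blows up, while the series centered at $-1$ settles down to $f(-2) = 1/3$.

```python
# A numerical check of the two series above, evaluated at x = -2,
# where f(-2) = 1/(1 - (-2)) = 1/3.

def partial_sum_T0(x, n_terms):
    """Partial sum of the series centered at 0: sum of x^k."""
    return sum(x**k for k in range(n_terms))

def partial_sum_Tm1(x, n_terms):
    """Partial sum of the series centered at -1: sum of (x+1)^k / 2^(k+1)."""
    return sum((x + 1)**k / 2**(k + 1) for k in range(n_terms))

# T_0 partial sums at x = -2 oscillate with growing magnitude (divergence):
print([partial_sum_T0(-2, n) for n in range(1, 6)])    # 1, -1, 3, -5, 11

# T_{-1} partial sums at x = -2 approach 1/3:
print([round(partial_sum_Tm1(-2, n), 4) for n in range(1, 6)])
print(abs(partial_sum_Tm1(-2, 30) - 1/3) < 1e-9)       # True
```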
---
Notice that in general, $F(x-a)$ slides the function $F(x)$ so that the new origin is located at $x=a$.
When we compute the Taylor series about zero, we're evaluating a power series of the form $F(x) = \sum \frac{f^{(k)}(0)}{k!} x^k$.
When we compute the series about a point $a$, we're computing $F(x-a) = \sum\frac{f^{(k)}(a)}{k!} (x-a)^k$, which is essentially "sliding" the polynomial over to the point $a$ while making sure that the coefficients of the sum match the behavior of $f(x)$ properly near $a$.
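The "sliding" idea can be illustrated with $e^x$, whose $k$-th derivative at $a$ is always $e^a$, so the series about $a$ is $\sum e^a (x-a)^k / k!$. In this sketch (function name is my own), both re-centered series represent the same function, but a truncation is most accurate near its own center:

```python
import math

# For f(x) = e^x, every derivative at a equals e^a, so the series about a is
# sum of e^a * (x - a)^k / k!.  Re-centering "slides" the polynomial to a.

def taylor_exp(x, a, n_terms):
    """Truncated Taylor series of e^x centered at a."""
    return sum(math.exp(a) * (x - a)**k / math.factorial(k)
               for k in range(n_terms))

x = 2.0
# With only 5 terms, the series centered at a = 2 hits e^2 on the nose
# (only the k = 0 term survives at the center), while the one centered
# at 0 still carries visible truncation error at x = 2.
err_far  = abs(taylor_exp(x, 0.0, 5) - math.exp(x))
err_near = abs(taylor_exp(x, 2.0, 5) - math.exp(x))
print(err_far, err_near)    # err_near is far smaller
```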
---
Intuitively, it means that you are anchoring a polynomial at a particular point in such a way that the polynomial agrees with the given function in value, first derivative, second derivative, and so on. Essentially, you are making a polynomial which looks just like the given function at that point. [Then Taylor's theorem guarantees that if continued to an infinite number of terms, the series will be the same as the given function.]
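The "agrees in value, first derivative, second derivative, and so on" anchoring can be verified numerically. Below is a sketch (helper names are my own) using the degree-5 Taylor polynomial of $\sin x$ at $0$ and central finite differences to confirm that the polynomial and the function match at the center:

```python
import math

# The degree-5 Taylor polynomial of sin x at 0: x - x^3/6 + x^5/120.
# "Anchoring" means p and sin agree in value and in the first several
# derivatives at the center; finite differences make that concrete.

def p(x):
    return x - x**3 / 6 + x**5 / 120

def d1(f, x, h=1e-5):           # numerical first derivative
    return (f(x + h) - f(x - h)) / (2 * h)

def d2(f, x, h=1e-4):           # numerical second derivative
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

print(abs(p(0.0) - math.sin(0.0)))              # values agree
print(abs(d1(p, 0.0) - d1(math.sin, 0.0)))      # first derivatives agree
print(abs(d2(p, 0.0) - d2(math.sin, 0.0)))      # second derivatives agree
```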
---
As Nick rightly answered first, and as the other answers reflect, the Taylor polynomial $T$ is a polynomial which looks just like the given function $f(x)$, i.e. it resembles or approximates the real function as closely as possible.
Evaluating $T$ at a point gives an approximate value of $f$ there. Since $T$ is just an approximating polynomial, there will be an error, and this error indicates how close the polynomial $T$ is to the real function $f(x)$.
In true terms, $f(x) = \sin x$ is $0$ at $x = 0$ and $1$ at $x = \pi/2$. Using a low-degree Taylor polynomial to estimate $\sin x$ at $x = 0$ and $x = \pi/2$, we might get $0$ and about $0.9$ respectively. This means the Taylor polynomial does approximate the value, but with an error of about $0.1$.
I hope this clears up your question.
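The size of that error depends on where the polynomial is truncated. The sketch below (function name is my own) evaluates the degree-$n$ Taylor polynomial of $\sin x$ centered at $0$ at $x = \pi/2$; the degree-3 polynomial gives about $0.92$, roughly the "$0.9$ with an error of about $0.1$" described above, and adding terms shrinks the error quickly:

```python
import math

# Taylor polynomial of sin x about 0, evaluated at pi/2, where the true
# value is 1.  The error shrinks as the truncation degree grows.

def sin_taylor(x, degree):
    """Taylor polynomial of sin x about 0, up to the given odd degree."""
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(degree // 2 + 1))

x = math.pi / 2
for deg in (1, 3, 5, 7):
    approx = sin_taylor(x, deg)
    print(deg, round(approx, 6), round(abs(approx - 1.0), 6))
```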
---
A Taylor series of a function $f(x)$ around $x=a$, when truncated at the $n$-th term, represents a polynomial of degree $n$ which, intuitively speaking, interpolates the function at the points $a, a+dx, a+2dx, \cdots, a+n\,dx$ with $dx \to 0$.
Then the difference between the (truncated) Taylor series around $x=a$ and around $x=b$ is clear: it is simply a polynomial interpolating $f(x)$ near $x=a$ versus one interpolating it near $x=b$.
Centered at $a$ means the choice of the point $a$ in the formula $$\sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} \, (x-a)^{n}$$
of the Taylor series.
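This general formula can be tried out on the function $f(x) = \frac{1}{1-x}$ from the first answer, whose $n$-th derivative at $a$ is $n!/(1-a)^{n+1}$. The sketch below (function names are my own) compares truncations centered at two different points and confirms that each is most accurate near its own center:

```python
# The general formula sum f^(n)(a)/n! * (x-a)^n, specialized to
# f(x) = 1/(1-x), where f^(n)(a)/n! = 1/(1-a)^(n+1).

def taylor(x, center, n_terms):
    """Truncated Taylor series of 1/(1-x) about the given center."""
    return sum((x - center)**n / (1 - center)**(n + 1)
               for n in range(n_terms))

f = lambda x: 1 / (1 - x)

# Near x = -1, the series centered at -1 beats the one centered at 0:
x = -0.9
print(abs(taylor(x, -1, 8) - f(x)))    # tiny
print(abs(taylor(x, 0, 8) - f(x)))     # noticeably larger
```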