I have always had this basic question about Taylor Approximations, but I never asked it because I thought it was too obvious. The way we are (briefly) introduced to Taylor Approximations in school (e.g. in an engineering faculty), it is almost taken as "obvious" that as the number of terms in the Taylor Approximation increases, the error decreases.
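(To fix notation: by the Taylor Approximation of $f$ with $N$ terms about a point $a$ I mean the usual partial sum
$$P_N(x) = \sum_{n=0}^{N} \frac{f^{(n)}(a)}{n!}\,(x-a)^n,$$
and by the Approximation Error I mean $f(x) - P_N(x)$.)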
But is there a way to demonstrate that "increasing the number of terms in the Taylor Approximation will necessarily reduce the Approximation Error"?
Or is this question so basic that the proof is "trivial"? I.e.
An infinite-term Taylor Approximation perfectly reproduces the original function.
There is a formula for the Approximation Error of an N-term Taylor Approximation (the remainder formula written out below).
Thus, since a Taylor Approximation with (N+1) terms contains all the terms of a Taylor Approximation with N terms, it follows (using the properties of addition and infinite series) that the (N+1)-term Approximation Error can be neither equal to nor greater than the N-term Approximation Error. Therefore, the error from a Taylor Approximation with more terms must always be less than the error from one with fewer terms.
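For reference, the error formula I have in mind is the Lagrange form of the remainder: assuming $f$ is $(N+1)$-times differentiable, there is some $\xi$ between $a$ and $x$ such that
$$f(x) - P_N(x) = \frac{f^{(N+1)}(\xi)}{(N+1)!}\,(x-a)^{N+1}.$$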
Is this correct?
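For what it's worth, here is the kind of quick numerical check I had in mind. This is only a minimal sketch: `taylor_poly` is a throwaway helper I wrote for this, and I picked $f = \exp$ about $a = 0$ with $x = 1.5$ purely for illustration.

```python
import math

def taylor_poly(derivs_at_a, a, x):
    """Evaluate the Taylor polynomial built from f(a), f'(a), ..., f^(N)(a)."""
    return sum(d / math.factorial(n) * (x - a) ** n
               for n, d in enumerate(derivs_at_a))

# Illustration only: f = exp about a = 0, where every derivative equals 1.
a, x = 0.0, 1.5
for N in range(1, 9):
    derivs = [1.0] * (N + 1)          # f^(0)(0), ..., f^(N)(0) for exp
    err = abs(math.exp(x) - taylor_poly(derivs, a, x))
    print(f"N = {N}: |f(x) - P_N(x)| = {err:.3e}")
```

In this particular example the printed error does shrink as N grows, but of course that is just one function at one point, which is exactly why I am asking whether the general claim can be proved.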
Thanks!