I'm being asked to consider a Runge function $$f(x) = \frac{1}{1+a^2x^2}$$ for arbitrary $a$, and $x \in [-1,1]$, and then compute a truncated Taylor series expansion of the form $$\sum_{k=0}^{N}a_kx^k$$ where the $a_k$ are "suitable coefficients".
I'm a bit confused about how I could go about constructing that sum. It doesn't even look like a Taylor series, so I'm not exactly sure what's expected. Even if I were to try to build a standard Taylor series, the derivatives of $f$ get messy very quickly.
Can someone shed some light on what this sum is, exactly?
It is known that $$\frac{1}{1+u}=1-u+u^2-u^3+\dots\qquad |u|<1.$$
With $u=(ax)^2$, we get $$\frac{1}{1+a^2x^2}=\frac{1}{1+(ax)^2}=1-(ax)^2+(ax)^4-(ax)^6+\dots\qquad |(ax)^2|<1. $$ Equivalently, we get $$\frac{1}{1+a^2x^2}=\sum_{n=0}^\infty (-1)^n(ax)^{2n},\qquad |x|<\frac{1}{|a|}.$$ In particular, the "suitable coefficients" $a_k$ in your sum $\sum_{k=0}^N a_kx^k$ are $a_k=(-1)^{k/2}a^k$ when $k$ is even and $a_k=0$ when $k$ is odd, so this really is the truncated Taylor series of $f$ about $x=0$.
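If it helps to see the convergence concretely, here is a minimal Python sketch (the function names are my own) that compares the truncated series against the Runge function at a point inside the radius of convergence $|x|<1/|a|$:

```python
def f(x, a):
    """Runge function 1 / (1 + a^2 x^2)."""
    return 1.0 / (1.0 + (a * x) ** 2)

def taylor(x, a, N):
    """Truncated series sum_{n=0}^{N} (-1)^n (a x)^{2n}."""
    return sum((-1) ** n * (a * x) ** (2 * n) for n in range(N + 1))

a = 2.0
x = 0.3  # |x| < 1/|a| = 0.5, so the series converges here
print(abs(taylor(x, a, 20) - f(x, a)))  # tiny truncation error
```

For $|x| \ge 1/|a|$ the partial sums diverge, which is one way to see why polynomial approximation of the Runge function on all of $[-1,1]$ is delicate.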