Questions about Approximating a Function at Infinity


I was playing around today and happened across a nice method for approximating functions at infinity. The first function I found for which the method works was

$$f(x) = \frac{1}{1+x^2}$$

The method is outlined as follows. Define an auxiliary function $g$ as

$$g(x) := f\left(\frac{1}{x}\right)$$

Since $g$ is analytic everywhere except possibly at the point $0$, we can express $g$ as

$$g(x) = \sum_{n=0}^\infty\frac{g^{(n)}(x_0)}{n!}(x-x_0)^n$$

for any point $x_0 \neq 0$. Now define $f_a$ as

$$f_a(x) := \lim_{x_0 \to 0}\sum_{n=0}^\infty\frac{g^{(n)}(x_0)}{n!}\left(\frac{1}{x}-x_0\right)^n$$
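For the example $f(x) = \frac{1}{1+x^2}$, this limit can be worked out explicitly (assuming the limit may be taken term by term). Here

$$g(x) = \frac{1}{1+\frac{1}{x^2}} = \frac{x^2}{1+x^2} = x^2 - x^4 + x^6 - \cdots \qquad (|x| < 1),$$

and letting $x_0 \to 0$ turns the Taylor series of $g$ about $x_0$ into its Maclaurin series evaluated at $\frac{1}{x}$, giving

$$f_a(x) = \frac{1}{x^2} - \frac{1}{x^4} + \frac{1}{x^6} - \cdots,$$

which is exactly the expansion of $f$ in powers of $\frac{1}{x}$, convergent for $|x| > 1$.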

I truncated the series defining $f_a$ after the third term and, rather than taking the limit, set $x_0 = 0.0001$ as a stand-in. Then I plotted the resulting graph (dotted blue) and compared it to the graph of $f$ (solid black), as shown below.
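For concreteness, here is a small SymPy sketch of the experiment described above (the truncation at three terms and the choice $x_0 = 0.0001$ follow the text; the variable names are my own):

```python
import sympy as sp

x = sp.symbols('x')

f = 1 / (1 + x**2)
g = f.subs(x, 1 / x)        # g(x) = f(1/x) = x^2 / (1 + x^2)

x0 = sp.Rational(1, 10000)  # stand-in for the limit x0 -> 0
N = 3                       # keep the first three terms (n = 0, 1, 2)

# Truncated Taylor polynomial of g about x0, evaluated at 1/x
f_a_expr = sum(
    g.diff(x, n).subs(x, x0) / sp.factorial(n) * (1 / x - x0) ** n
    for n in range(N)
)

f_num = sp.lambdify(x, f)
f_a = sp.lambdify(x, f_a_expr)

# Compare f and its approximation as x grows
for t in (2, 10, 100):
    print(t, f_num(t), f_a(t))
```

The printed values suggest the agreement improves rapidly as $x \to \infty$, while for small $x$ the two graphs separate, consistent with the plot.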

So we can see that "$f_a$ approximates $f$ at infinity", in some sense. I suspect that this method is related to asymptotic expansions, but I am having trouble formalizing the ideas presented here. My two questions are:

Formally, what is the process that I am getting at here?

For what class of functions can we expect this method to work? Analyticity of $g$ near $0$ seems like an obvious requirement, but the method also seemed to fail for functions like $\dfrac{\sin(x)}{1+x^2}$.
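To sharpen the second question: for $f(x) = \dfrac{\sin(x)}{1+x^2}$ the auxiliary function is

$$g(x) = f\left(\frac{1}{x}\right) = \frac{x^2 \sin\left(\frac{1}{x}\right)}{1+x^2},$$

and $\sin\left(\frac{1}{x}\right)$ has an essential singularity at $0$, so $g$ cannot be extended analytically (or even continuously) to $0$; this seems like a plausible reason the limit $x_0 \to 0$ of the Taylor coefficients fails to exist in that case.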

Moreover, if anyone has some online references that they could point me to for more information, that would be appreciated as well.