Question about divergence of a function vs. divergence of its expansion


I have the following function $$f(a,x)=\frac{1}{1-a\sqrt{1-x^2}}$$ For small $x$ (i.e. $x \ll 1$), I can Taylor expand around $x=0$: $$f(a,x)=\frac{1}{1-a}-\frac{ax^2}{2(1-a)^2}+\mathcal{O}(x^4)$$ Now, for $a=1$, the first expression reduces to $$f(a=1,x)=\frac{1}{1-\sqrt{1-x^2}}$$ which does not diverge if $x$ is small but nonzero, correct? Why, then, does the Taylor-expanded version diverge for $a=1$, even though $x$ is finite and small?

Any help will be appreciated.

Best answer:

Because as $x \to 0$, $f(1,x) \to \infty$. The denominator satisfies $1-\sqrt{1-x^2} = \frac{x^2}{2} + \mathcal{O}(x^4)$, so $f(1,x) \approx \frac{2}{x^2}$ for small $x$: finite at any fixed $x \neq 0$, but unbounded as $x \to 0$, and $f(1,0)$ does not exist at all. For a Taylor series around $x=0$ to exist, the function and all its derivatives must be defined at $x=0$, which fails here. That failure shows up in the expansion as the $\frac{1}{1-a}$ coefficients blowing up as $a \to 1$.
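A quick numerical check makes this concrete (a minimal Python sketch; the function names `f` and `taylor2` are mine, not from the question):

```python
import math

def f(a, x):
    """Exact function f(a, x) = 1 / (1 - a*sqrt(1 - x**2))."""
    return 1.0 / (1.0 - a * math.sqrt(1.0 - x * x))

def taylor2(a, x):
    """Two-term Taylor expansion around x = 0; only meaningful for a != 1."""
    return 1.0 / (1.0 - a) - a * x**2 / (2.0 * (1.0 - a) ** 2)

x = 0.01
# Away from a = 1, the truncated series tracks the function closely.
print(f(0.5, x), taylor2(0.5, x))
# At a = 1 the exact function is still finite for this fixed x != 0
# (roughly 2/x**2), even though no Taylor series around x = 0 exists there.
print(f(1.0, x))
```

For $a=0.5$ the two values agree to many digits, while for $a=1$ the exact function returns a large but finite number close to $2/x^2$, and `taylor2(1.0, x)` would divide by zero.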