I came across the following question.
Let $c \in \mathbb{R}$ and let $f: \mathbb{R} \to \mathbb{R}$ be defined by:
$$ f(x) = \frac{1+ c x^2}{1+ x^2}$$
Let $c \neq 1$. Determine the radius of convergence of the Taylor series of $f$ at the point $a = 0$. What happens when $c = 1$?
I tried to write down the Taylor series; the first derivative is easy, but then it gets trickier:
$$ f'(x) = \frac{2x(c-1)}{(1+ x^2)^2}$$
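One rewrite that might make the pattern clearer (this is just algebra on the definition of $f$, not a solution):
$$ f(x) = \frac{1+ c x^2}{1+ x^2} = 1 + \frac{(c-1) x^2}{1+ x^2},$$
so only the factor $\frac{1}{1+x^2}$ would have to be expanded, but I do not see how to turn this into the radius of convergence.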
How can you determine the radius of convergence of a Taylor series, especially if you cannot find a general formula for its coefficients?
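To get a feel for the coefficients, a small numerical experiment could help (a sketch using sympy; the value $c = 3$ is an arbitrary sample with $c \neq 1$, and the root-test estimate from finitely many terms is only a hint, not a proof):

```python
# Sketch: expand f around a = 0 with sympy and look at the coefficients.
# c = 3 is an arbitrary sample value with c != 1.
import sympy as sp

x = sp.symbols('x')
c = 3
f = (1 + c*x**2) / (1 + x**2)

# Taylor expansion at 0 up to (but not including) x**12.
poly = sp.series(f, x, 0, 12).removeO()
coeffs = sp.Poly(poly, x).all_coeffs()[::-1]  # coefficient of x**0, x**1, ...
print(coeffs)

# Rough root-test estimate of the radius: 1 / |a_n|**(1/n) for the last few
# nonzero coefficients. Only a numerical hint, not a proof.
nonzero = [(n, abs(a)) for n, a in enumerate(coeffs) if a != 0 and n > 0]
for n, a in nonzero[-3:]:
    print(n, 1 / float(a) ** (1.0 / n))
```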