Question states
Let $f: \mathbf{R} \rightarrow \mathbf{R} $ be the function given by
$f(x)=\frac {x}{4+x^2} $
Give the Taylor series of $f$ around the point $x = 0$ and state its radius of convergence.
My guess is the $n$-th root test, but I'm not sure. Also, please do not use intermediate calculus to solve this problem.
A few things may help.
If $g$ has radius of convergence (ROC) $R$, then $x \mapsto x g(x)$ has the same ROC, so we can just look at $g(x) = {1 \over 4+x^2}$.
Note that if $h(x) = {1 \over 1+x}$ we can write $g(x) = {1 \over 4}h(({x \over 2})^2)$. The series for $h(u)$ converges when $|u| < R$, and here $u = ({x \over 2})^2$, so if $h$ has ROC $R$ then we know that $g$ has ROC of at least $2\sqrt{R}$.
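To make the substitution and the resulting radius explicit:

```latex
\frac{1}{4}\,h\!\left(\left(\tfrac{x}{2}\right)^2\right)
  = \frac{1}{4}\cdot\frac{1}{1+\frac{x^2}{4}}
  = \frac{1}{4+x^2} = g(x),
\qquad
\left|\left(\tfrac{x}{2}\right)^2\right| < R
  \iff |x| < 2\sqrt{R}.
```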
(Full disclosure: It is possible, since $x^2 \ge 0$, that the ROC of $g$ could be larger.)
The expansion of $h$ about $x=0$ is $1-x+x^2-x^3+\cdots$ with a ROC of $1$, and in this case, if $|x|=1$ the series does not converge, so the previous parenthetical remark does not apply: the ROC of $g$ is exactly $2\sqrt{1} = 2$.
The series for $g$, and hence for $f$, follows from the series for $h$.
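Writing it out, substituting $({x \over 2})^2$ into the geometric series for $h$ and multiplying by $x$:

```latex
g(x) = \frac{1}{4}\sum_{n=0}^{\infty}(-1)^n\left(\frac{x^2}{4}\right)^n
     = \sum_{n=0}^{\infty}\frac{(-1)^n}{4^{n+1}}\,x^{2n},
\qquad
f(x) = x\,g(x)
     = \sum_{n=0}^{\infty}\frac{(-1)^n}{4^{n+1}}\,x^{2n+1},
```

both with radius of convergence $2$.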