I have a function of the form \begin{equation} f(x) = \frac{1 - x^{1/2} + x - x^{3/2} + \ldots}{1+x^{1/2} - x + x^{3/2} - \ldots}. \end{equation} Intuitively, I would expect that for small $x$ \begin{equation} f(x) \approx \frac{1-x^{1/2}}{1+x^{1/2}} \end{equation} and, furthermore, \begin{equation} f(x) \approx 1 - a x^{1/2} + \ldots \end{equation} where $a$ is some factor. My question is: how can I determine $a$ and the range of $x$ for which this approximation is valid? Obviously, I cannot use a Taylor expansion, since $f$ is not analytic and its derivative diverges at the origin.
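For concreteness, here is the formal manipulation I would like to justify: expanding $\frac{1}{1+x^{1/2}}$ as a geometric series and multiplying out gives \begin{equation} \frac{1-x^{1/2}}{1+x^{1/2}} = (1-x^{1/2})\,(1 - x^{1/2} + x - \ldots) = 1 - 2x^{1/2} + 2x - \ldots, \end{equation} which suggests $a = 2$, but I do not know whether this is legitimate here, nor how to control the error.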
Let me point out that I am not so much interested in the specific example above, which I have just invented. Rather, I would like to know the general theory and methods behind this type of function involving fractional powers.
Another way to treat this particular sort of problem: $f$ is an analytic function of, say, $z$, where $z=x^{1/2}$, so it can be expanded as an ordinary Taylor series in $z$ and the result rewritten in terms of $x$ at the end.
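As a minimal worked sketch for the example above, assuming the dots continue the alternating geometric pattern in powers of $x^{1/2}$: with $z = x^{1/2}$, \begin{equation} f = \frac{1 - z + z^2 - z^3 + \ldots}{1 + z - z^2 + z^3 - \ldots} = \frac{\dfrac{1}{1+z}}{\,1 + \dfrac{z}{1+z}\,} = \frac{1}{1+2z}, \qquad |z| < 1, \end{equation} which is analytic at $z = 0$ with Taylor series \begin{equation} f = 1 - 2z + 4z^2 - \ldots = 1 - 2x^{1/2} + 4x - \ldots, \end{equation} so $a = 2$, and the expansion converges for $|z| < 1/2$, i.e. $0 \le x < 1/4$. In general, once the function is rewritten in terms of $z$, the standard Taylor machinery applies, and the radius of convergence in $z$ translates into a range of validity in $x = z^2$.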