When I ask Mathematica to compute the series of $f(x)=\frac{\cos(x)}{\sin(x)}$ around $0$, I get:
$$\frac{1}{x}-\frac{x}{3}-\frac{x^3}{45}-\frac{2 x^5}{945}-\frac{x^7}{4725}-\frac{2 x^9}{93555}+\dots\tag{$\star$}$$
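As a quick sanity check (a pure-Python sketch; the truncation point and the test value are my own choices, and `cot_partial` is a hypothetical helper name), the partial sum of $(\star)$ already matches $\cos(x)/\sin(x)$ to near machine precision for small $x$:

```python
import math

def cot_partial(x):
    # partial sum of the Laurent expansion (star) up to the x^9 term
    return (1/x - x/3 - x**3/45 - 2*x**5/945
            - x**7/4725 - 2*x**9/93555)

x = 0.2
print(cot_partial(x) - math.cos(x) / math.sin(x))  # difference is tiny
```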
For this series we can't just compute the derivatives and evaluate them at $a=0$, because the denominator vanishes there, so we must appeal to some kind of trick. I just learned that we can write:
$$\frac{\cos(x)}{\sin(x)}=p(x) \implies \cos(x)=\sin(x) p(x)$$
where $p(x)=c[0]+c[1]x+\dots$ is a power series, and then we expand $\cos(x)$ and $\sin(x)$ as series, multiply the series of $\sin(x)$ by $p(x)$, collect the coefficients, and compare them with the coefficients of the expansion of $\cos(x)$.
But in this case, it seems that something weird happens: when I expand $\cos(x)$, I get the coefficients of $x^n$ in the first column below; when I expand $\sin(x)$, multiply by $p(x)$, and collect the coefficients, I get the coefficients in the second column:
$$ \begin{array}{c} 1 \\ 0 \\ -\frac{1}{2} \\ 0 \\\vdots \end{array} \hspace{4cm} \begin{array}{c} 0 \\ c(0) \\ c(1) \\ c(2)-\frac{c(0)}{6} \\\vdots \end{array}$$
So there is no coefficient of $x^0$ to be compared. I thought about writing it as
$$\frac{\cos(x)}{\sin(x)}=\frac{1}{p(x)} \implies p(x) \cos(x)=\sin(x) $$
Now we have (i.e., it's now possible to compare all the coefficients):
$$ \begin{array}{c} 0 \\ 1 \\ 0 \\ -\frac{1}{6} \\\vdots \end{array}\hspace{4cm} \begin{array}{c} c(0) \\ c(1) \\ c(2)-\frac{c(0)}{2} \\ c(3)-\frac{c(1)}{2} \\\vdots \end{array} $$
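This comparison can be carried out mechanically. Here is a sketch using only the Python standard library (the names `cos_c`, `sin_c`, and `c` are mine): since the constant term of the cosine series is $1$, the coefficient of $x^n$ in $p(x)\cos(x)=\sin(x)$ lets us isolate each new $c(n)$ in turn.

```python
from fractions import Fraction
from math import factorial

N = 8  # how many coefficients c(0), ..., c(N-1) to compute

# Taylor coefficients of cos(x) and sin(x) around 0
cos_c = [Fraction((-1) ** (n // 2), factorial(n)) if n % 2 == 0 else Fraction(0)
         for n in range(N)]
sin_c = [Fraction((-1) ** ((n - 1) // 2), factorial(n)) if n % 2 == 1 else Fraction(0)
         for n in range(N)]

# Match coefficients of x^n in p(x)*cos(x) = sin(x):
# sum over j <= n of c_j * cos_{n-j} equals sin_n, and cos_0 = 1,
# so c_n can be isolated from the terms with j < n.
c = []
for n in range(N):
    c.append(sin_c[n] - sum(c[j] * cos_c[n - j] for j in range(n)))

print(c)  # c(0)=0, c(1)=1, c(2)=0, c(3)=1/3, ... (the tangent series)
```

The coefficients that come out are exactly those of $\tan(x)$, which is what $p(x)=\sin(x)/\cos(x)$ should be.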
And this series gives a pretty good approximation of $f$ on a small interval.

But I would still like to know how it's possible to expand the series of $f$ and obtain $(\star)$, and whether there is some general trick for expanding series at points where the denominator vanishes.
EDIT: In the comments below, I was reminded of Laurent series. But even with this, I don't see what algebraic manipulation can be done so that we can expand $f$ into $(\star)$.
Your basic idea isn't flawed, but it does need some care to execute correctly.
If we assume $p$ has some expansion of the form $$p(x) = \sum_{k = -\infty}^\infty a_k x^k,$$ then we want the solutions for coefficients $a_k$ satisfying $$1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \frac{x^6}{6!} + \cdots = \left(x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots\right)\left(\cdots + \frac{a_{-2}}{x^2} + \frac{a_{-1}}{x} + a_0 + a_1 x + a_2 x^2 + \cdots\right).$$
It's not difficult to see that $a_k = 0$ for all $k \le -2$: the series expansion of the sine function begins with a term of degree $1$, so a nonzero $a_k$ with $k \le -2$ would produce a term of degree $k + 1 \le -1$ in the product on the RHS, whereas the LHS contains no terms of negative degree.
We can also see that $a_{-1} = 1$, since the only constant term arising from the product on the RHS is $x \cdot \frac{a_{-1}}{x} = a_{-1}$, which must equal the constant term $1$ on the LHS. Matching the coefficients of each higher power of $x$ in the same way lets us calculate the remaining coefficients recursively, as this in turn implies $$a_0 x = 0, \\ -\frac{a_{-1}}{3!} x^2 + a_1 x^2 = - \frac{x^2}{2!}, \\ -\frac{a_0}{3!} x^3 + a_2 x^3 = 0,$$ and so forth. Working carefully, this gives us $a_0 = 0$, $a_1 = -\frac{1}{2!} + \frac{1}{3!} = -1/3$, etc., which is the desired expansion.
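The recursion above can also be run mechanically. A sketch in Python (standard library only; the names `a`, `cos_c`, `sin_c` and the index convention are mine), storing $a_{-1}$ at index $0$: since $\sin(x)$ begins with $x$, the coefficient of $x^n$ in $\sin(x)\,p(x)$ involves $a_{n-1}$ multiplied by $1$, so each equation determines exactly one new coefficient.

```python
from fractions import Fraction
from math import factorial

def cos_c(n):
    # Taylor coefficient of x^n in cos(x)
    return Fraction((-1) ** (n // 2), factorial(n)) if n % 2 == 0 else Fraction(0)

def sin_c(n):
    # Taylor coefficient of x^n in sin(x)
    return Fraction((-1) ** ((n - 1) // 2), factorial(n)) if n % 2 == 1 else Fraction(0)

N = 12
a = []  # a[i] holds the Laurent coefficient a_{i-1}, so a[0] = a_{-1}
for n in range(N):
    # coefficient of x^n in sin(x)*p(x) must equal cos_n; isolate a_{n-1}
    a.append(cos_c(n) - sum(a[i] * sin_c(n - (i - 1)) for i in range(n)))

print(a[:8])  # 1, 0, -1/3, 0, -1/45, 0, -2/945, 0 -- the coefficients of (star)
```

Running this reproduces $1/x - x/3 - x^3/45 - 2x^5/945 - \dots$, matching the Mathematica output.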
Now, this is not necessarily how we go about solving for such series expansions in the general case. Laurent expansions, as you mentioned, can also be calculated using residue theory.