This question comes from a programming problem. Let $f(x)$ and $g(x)$ be two given polynomials. I'd like to compute the coefficients of the series expansion of $\frac{f(x)}{g(x)}$ under each of the two conditions below. How can the coefficients be obtained efficiently?
a) I'm happy with the coefficients having a given number of significant digits;
b) I'm happy with the coefficients modulo a given number, say $M$.
Edit: I notice that the coefficients may satisfy some linear recurrence relation.
Oh man, there is a really cool answer to this! First I will show how to compute arbitrarily many coefficients of $\frac1{f(x)}$. You can then obtain $f(x)/g(x)$ by multiplying $f(x)$ with a long enough Taylor expansion of $1/g(x)$. I got this from Modern Computer Algebra, 3rd ed., by Joachim von zur Gathen and Jürgen Gerhard, p. 259.
First of all, you need to assume that the constant term of $f$ is $1$. If it is zero, fix this by factoring out a power of $x$ so that what remains has a nonzero constant term, and then factor out that constant term so what remains has constant term $1$. The factored-out pieces must be multiplied back in at the end; note that if you factored out $x^k$, the expansion of $1/f$ begins at $x^{-k}$.
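Here is a tiny sketch of that bookkeeping step (the helper name `normalize` and the use of exact `Fraction` coefficients are my own choices; coefficient lists are lowest degree first):

```python
from fractions import Fraction

def normalize(f):
    """Write f(x) = c * x^k * h(x) with h(0) = 1; f is a coefficient
    list [f0, f1, ...] (lowest degree first), not identically zero."""
    k = 0
    while f[k] == 0:              # factor out x^k
        k += 1
    c = f[k]                      # factor out the constant term
    h = [Fraction(a) / c for a in f[k:]]
    return k, c, h
```

Then $1/f(x) = \frac1c\,x^{-k}\,\frac1{h(x)}$, which is where the factored-out pieces come back in at the end.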
From now on, assume $f(0)=1$. Define the following sequence of polynomials, $g_0,g_1,\dots$ recursively by
\begin{align} g_0 &= 1\\ g_i &= 2g_{i-1}-fg_{i-1}^2\pmod{x^{2^i}} \tag{$i\ge 1$} \end{align}
You can then show that the first $2^n$ coefficients of $g_n$ agree with those of the Taylor series of $1/f(x)$. This means you only need about $\log_2 n$ steps to get the first $n$ coefficients of $1/f(x)$.
You can ignore the $\pmod{x^{2^i}}$ part if you like, but this offers a speed-up opportunity. It says that during the $i$-th step, you only need to keep track of the first $2^i$ coefficients of $g_i$, so you only need the first $2^{i}$ coefficients of $f$ to compute $g_i$.
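A minimal Python sketch of the iteration, assuming $f$ is given as a coefficient list $[f_0, f_1, \dots]$ with $f_0 = 1$ (the names `poly_mul` and `series_inverse` are mine, and the multiplication here is naive rather than FFT-based):

```python
from fractions import Fraction

def poly_mul(a, b, trunc):
    """Multiply coefficient lists a and b, keeping only the first `trunc` terms."""
    out = [Fraction(0)] * min(trunc, len(a) + len(b) - 1)
    for i, ai in enumerate(a[:trunc]):
        for j, bj in enumerate(b[:trunc - i]):
            out[i + j] += ai * bj
    return out

def series_inverse(f, n):
    """First n coefficients of the power series 1/f(x), assuming f[0] == 1."""
    assert f and f[0] == 1
    g, t = [Fraction(1)], 1
    while t < n:
        t *= 2
        # g_i = 2*g_{i-1} - f*g_{i-1}^2  (mod x^t), so only t terms are kept
        fg2 = poly_mul(f, poly_mul(g, g, t), t)
        g = [2 * (g[i] if i < len(g) else 0)
             - (fg2[i] if i < len(fg2) else 0) for i in range(t)]
    return g[:n]
```

For example, `series_inverse([1, -1], 5)` gives the geometric series `[1, 1, 1, 1, 1]`, i.e. the expansion of $1/(1-x)$.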
Finally, if you use FFT-based polynomial multiplication to compute each $g_i$, the whole process takes only $O(n\log n)$ arithmetic operations to get the first $n$ coefficients of $1/f(x)$: the work roughly doubles from one step to the next, so the total cost is dominated (up to a constant factor) by the final step.
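The same iteration also answers part (b) of the question directly: run it with all arithmetic reduced modulo $M$, then multiply by $f$ at the end to get the coefficients of $f(x)/g(x)$ modulo $M$. This requires $g(0)$ to be invertible mod $M$; the sketch below simply assumes $g(0) = 1$, and the names `mul_mod` and `div_mod` are mine:

```python
def mul_mod(a, b, trunc, M):
    """Product of coefficient lists modulo M, first `trunc` terms only."""
    out = [0] * min(trunc, len(a) + len(b) - 1)
    for i, ai in enumerate(a[:trunc]):
        for j, bj in enumerate(b[:trunc - i]):
            out[i + j] = (out[i + j] + ai * bj) % M
    return out

def div_mod(f, g, n, M):
    """First n coefficients of f(x)/g(x) mod M, assuming g[0] == 1."""
    assert g and g[0] % M == 1
    inv, t = [1], 1
    while t < n:
        t *= 2
        # Newton step for 1/g, with coefficients reduced mod M
        gi2 = mul_mod(g, mul_mod(inv, inv, t, M), t, M)
        inv = [(2 * (inv[i] if i < len(inv) else 0)
                - (gi2[i] if i < len(gi2) else 0)) % M for i in range(t)]
    return mul_mod(f, inv, n, M)
```

For instance, $\frac{x}{1-x-x^2}$ generates the Fibonacci numbers, so `div_mod([0, 1], [1, -1, -1], 10, 1000)` returns `[0, 1, 1, 2, 3, 5, 8, 13, 21, 34]`.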