I'm doing an online course on Enumerative Combinatorics. In the chapter on formal power series and nonlinear recurrence relations, we start off by looking at compositions of formal power series $A(B(q))$ expressed as:
$$\sum_{n\geq0} a_{n}(B(q))^n$$
with $a_n$ the coefficients of $A(q)$.
An exercise proposed by the instructor is to determine why computing each coefficient of the above sum takes finitely many steps if the constant term $b_0$ of $B(q)$ is $0$.
I had a little think and I don't see why this would be possible if $b_0 = 0$. Let's take the second term of $A(B(q))$, namely $a_1B(q)$, and assume the constant term $b_0$ of $B(q)$ is $0$.
We then have $a_1B(q) = a_1b_1q + a_1b_2q^2 + \dots$, since $a_1b_0 = 0$.
Is this last expression finite somehow?
Let $A(B(q))=\sum_{n\ge 0}c_nq^n$, and suppose you want to compute $c_n$. The key idea is that the only contributions to $c_n$ come from the terms $$a_0+a_1 (B(q))^1+a_2(B(q))^2+\dots + a_n (B(q))^n.$$ Indeed, since $b_0=0$, every term of $(B(q))^m$ has degree at least $m$, so for any $m>n$ the power $(B(q))^m$ only involves powers of $q$ greater than $n$. Therefore you just need to extract the coefficient of $q^n$ from each of the above $n+1$ terms, which can clearly be done in finitely many steps.
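To make the "finitely many steps" concrete, here is a minimal sketch (the helper `compose` is hypothetical, not from the course) that computes $c_0,\dots,c_N$ of $A(B(q))$ from truncated coefficient lists, stopping at $m=N$ precisely because, with $b_0=0$, higher powers of $B(q)$ cannot affect these coefficients:

```python
def compose(a, b, N):
    """Return coefficients c_0..c_N of A(B(q)), truncated at degree N.

    `a` and `b` are lists of coefficients; requires b[0] == 0 so that
    (B(q))^m contributes nothing below degree m.
    """
    assert b[0] == 0, "composition needs b_0 = 0"
    c = [0] * (N + 1)              # accumulates the result
    power = [1] + [0] * N          # coefficients of (B(q))^0 = 1
    for m in range(N + 1):         # terms with m > N cannot affect c_0..c_N
        if m < len(a):
            for k in range(N + 1):
                c[k] += a[m] * power[k]
        # multiply `power` by B(q), truncating at degree N
        new = [0] * (N + 1)
        for i in range(N + 1):
            if power[i] == 0:
                continue
            for j in range(min(len(b), N + 1 - i)):
                new[i + j] += power[i] * b[j]
        power = new
    return c

# Example: A(q) = 1/(1-q), B(q) = q, so A(B(q)) = 1/(1-q) = 1 + q + q^2 + ...
print(compose([1] * 6, [0, 1], 4))  # → [1, 1, 1, 1, 1]
```

Note that the inner loop over degrees is also finite: each coefficient is a finite sum of products of the $a_m$ and $b_j$, which is exactly the point of the exercise.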
We require $b_0=0$ because otherwise the composition is not a well-defined formal power series. For example, let $A(q)=1/(1-q)=1+q+q^2+\dots$ and let $B(q)=1$. Then $A(B(q))$ would be $1+1+1+\dots$, so even its constant term would require summing infinitely many nonzero values, which is impossible for a formal power series.