Let $g(x)$ be the formal power series defined by the infinite product $$g(x) = (1 + x)(1 + x^2)(1 + x^4)\cdots(1 + x^{2^k})\cdots$$ Show that $(1 - x)\,g(x) = 1$. My book gives the following proof:
Using the fact that $(1 - x^{2^k})(1 + x^{2^k}) = 1 - x^{2^{k+1}}$:
$$ (1 - x)\,g(x) = (1 - x)(1 + x)(1 + x^2)(1 + x^4)\cdots = (1 - x^2)(1 + x^2)(1 + x^4)\cdots $$ $$ = (1 - x^4)(1 + x^4)(1 + x^8)\cdots = (1 - x^8)(1 + x^8)(1 + x^{16})\cdots = \cdots = 1 $$
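As a sanity check (not part of the book's proof), the telescoping can be verified numerically by multiplying truncated coefficient lists; the helper `polymul_trunc` below is my own illustrative code, not a standard library function:

```python
# Sanity check of the telescoping identity on truncated polynomials.
# Coefficients are stored as lists: p[i] is the coefficient of x^i.
# All products are truncated at degree N, which is exactly how one
# computes with formal power series in practice.

N = 64  # truncation degree

def polymul_trunc(p, q, n=N):
    """Multiply two coefficient lists, keeping only degrees < n."""
    r = [0] * n
    for i, a in enumerate(p):
        if a == 0 or i >= n:
            continue
        for j, b in enumerate(q):
            if i + j < n:
                r[i + j] += a * b
    return r

# (1 - x)(1 + x)(1 + x^2)(1 + x^4)...(1 + x^32), truncated at degree N
prod = [1, -1] + [0] * (N - 2)  # coefficients of 1 - x
for k in range(6):              # factors (1 + x^(2^k)) for 2^k up to 32
    factor = [0] * N
    factor[0] = 1
    factor[2 ** k] = 1
    prod = polymul_trunc(prod, factor)

# Telescoping predicts this partial product equals 1 - x^64, so below
# degree 64 every coefficient except the constant term is 0.
print(prod[0])                        # 1
print(all(c == 0 for c in prod[1:]))  # True
```

Every further factor $(1+x^{64})(1+x^{128})\cdots$ would only change coefficients of degree $\ge 64$, so the low-order coefficients shown here are already final.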
Each step of the reasoning shows that one more term is "eaten" by the multiplication. I do not see how this equals $1$, and I am not close to accepting it as a formally rigorous proof. If it were an ordinary power series (numeric, not formal), I can see how this argument could easily be turned into a rigorous one using limits, but I do not see why a formal power series of this form equals $1$.
I have added the logic tag because I do not understand the proof technique used here. It is not one of the widely used "infinitary" proof techniques such as limits or induction.
What we really want to understand is why these sorts of formal manipulations are valid. Two formal power series are equal exactly when all of their coefficients agree, so it suffices to convince ourselves that, for any fixed $n$, the coefficient of $x^i$ in $(1-x)g(x)$ is $0$ for every $0 < i < n$ and $1$ for $i = 0$. The key observation is that each coefficient of $x^i$ depends on only finitely many factors of the product: every factor $(1+x^{2^k})$ with $2^k > i$ is $1$ plus terms of degree greater than $i$, so it cannot affect that coefficient. By using the "eating" process on the remaining finite partial product, you can show that the coefficient of each $x^i$ with $i>0$ is $0$. By inductively repeating the process for larger and larger $n$, you discover all of the coefficients of the formal power series, so you know what the formal power series is. Then, note that every time we "ate" the previous term, the constant term, $1$, was preserved, so, inductively, we see that the constant term will always be $1$, and thus that $(1-x)g(x)=1$.
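To spell out the finite computation behind the "eating" process: a straightforward induction on $K$ (base case $(1-x)(1+x) = 1 - x^2$, inductive step $(1-x^{2^K})(1+x^{2^K}) = 1 - x^{2^{K+1}}$) gives, for every $K \ge 0$,
$$(1-x)\prod_{k=0}^{K}\bigl(1+x^{2^k}\bigr) = 1 - x^{2^{K+1}}.$$
Since every factor beyond the $K$-th contributes only terms of degree $\ge 2^{K+1}$, for each $i < 2^{K+1}$ we get
$$[x^i]\,(1-x)g(x) = [x^i]\bigl(1 - x^{2^{K+1}}\bigr) = \begin{cases}1 & i = 0,\\ 0 & 0 < i < 2^{K+1}.\end{cases}$$
As $K$ is arbitrary, every coefficient of $(1-x)g(x)$ matches the corresponding coefficient of $1$, which is exactly what equality of formal power series means.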