Derivative of product of polynomials: why does it suffice to check Leibniz for monomials?


I'm reading I. N. Herstein's Topics in Algebra, Second Edition. In one of the chapters on polynomials, there is a step in a proof that I was not able to understand.

Let $$f(x) = a_0 + a_1x + \dots+ a_n x^n \in F[x]$$ where $F$ is a field, and $$f'(x) = a_1 + 2 a_2x + \dots + n a_n x^{n-1}.$$

Verify that \begin{align} \left(f(x) g(x)\right)' = f'(x)g(x) + f(x) g'(x). \label{op.eq.leibniz} \tag{1} \end{align}

To prove this, the book says it is "enough only to consider the special case $f(x) = x^i$, $g(x) = x^j$". I don't understand why the general case does not need to be proved directly.
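As a sanity check on identity \eqref{op.eq.leibniz}, here is a minimal sketch in plain Python that represents a polynomial $a_0 + a_1x + \dots + a_nx^n$ as the coefficient list $[a_0, a_1, \dots, a_n]$ and verifies the product rule on a concrete example. The helper names `deriv`, `mul`, `add` and the sample polynomials are illustrative choices, not from the book.

```python
def deriv(p):
    """Formal derivative: the coefficient of x^{k-1} in p' is k*a_k."""
    return [k * p[k] for k in range(1, len(p))] or [0]

def mul(p, q):
    """Product of two polynomials given as coefficient lists."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def add(p, q):
    """Sum of two coefficient lists, padding the shorter with zeros."""
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

f = [1, 0, 3, 2]   # 1 + 3x^2 + 2x^3
g = [5, -1, 4]     # 5 - x + 4x^2

lhs = deriv(mul(f, g))                          # (fg)'
rhs = add(mul(deriv(f), g), mul(f, deriv(g)))   # f'g + fg'
print(lhs == rhs)  # -> True
```

Of course, checking one example proves nothing; the point of the answer below is the structural reason why checking monomials alone settles every case.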

There is 1 answer below.

The map $\Phi\colon F[x]\times F[x]\to F[x]$, $(f,g)\mapsto (fg)'-f'g-fg'$, is bilinear: $\Phi(cf,g)=c\Phi(f,g)$ for $c\in F$, and $\Phi(f_1+f_2,g)=\Phi(f_1,g)+\Phi(f_2,g)$, both straightforward to check from the definitions, and the corresponding equalities in the second argument follow from the symmetry $\Phi(f,g)=\Phi(g,f)$.

We want to show that $\Phi(f,g)=0$ for all $f$ and $g$. Since each $f$ (and $g$) is a linear combination of monomials $x^i$ and $\Phi$ is bilinear, it suffices to show that $\Phi(x^i,x^j)=0$ for all $i,j$: $$ \Phi(f,g)=\Phi\Big(\sum_i a_ix^i,\sum_j b_jx^j\Big)=\sum_{i,j}a_ib_j\underbrace{\Phi(x^i,x^j)}_{=0}=0.$$
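For completeness, the monomial case itself, which the book leaves to the reader, is a one-line computation directly from the definition of the formal derivative:

$$
\Phi(x^i,x^j) = (x^{i+j})' - (x^i)'x^j - x^i(x^j)'
= (i+j)\,x^{i+j-1} - i\,x^{i-1}x^j - j\,x^ix^{j-1} = 0.
$$

So the bilinearity argument reduces the general identity \eqref{op.eq.leibniz} to this single computation.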