Let $R$ be a commutative ring with unity $1$. Let $f(x)=r_0+r_1x+\dots+r_nx^n\in R[x]$ and define its formal derivative as $f'(x)=r_1+2r_2x+\dots+nr_nx^{n-1}$. Prove that $(f+g)'(x)=f'(x)+g'(x)$ and that $(fg)'(x)=f'(x)g(x)+f(x)g'(x)$.
So I'm pretty sure I worked out the addition, but I am struggling with the multiplication.
Let $f(x),g(x)\in R[x]$ with $f(x)=r_0+r_1x+r_2x^2+\dots+r_nx^n$ and $g(x)=s_0+s_1x+s_2x^2+\dots+s_nx^n$ (padding with zero coefficients so that both have the same degree $n$).
$$(f+g)'(x)=\bigl((r_0+s_0)+(r_1+s_1)x+\dots+(r_n+s_n)x^n\bigr)'$$ $$=(r_1+s_1)+2(r_2+s_2)x+\dots+n(r_n+s_n)x^{n-1}$$ $$=(r_1+s_1)+(2r_2+2s_2)x+\dots+(nr_n+ns_n)x^{n-1}$$ $$=f'(x)+g'(x)$$
For the multiplication, this is what I've got so far...
$$(fg)(x)=(r_0s_0)+(r_0s_1+r_1s_0)x+\dots+m_{2n}x^{2n}$$ where $$m_j=\sum_{i+k=j} r_is_k$$ for $j=0,1,2,\dots,2n$,
$$(fg)'(x)=(r_0s_1+r_1s_0)+2(r_0s_2+r_1s_1+r_2s_0)x+\dots+2n\,m_{2n}x^{2n-1}$$ $$=(r_0s_1+r_1s_0)+(2r_0s_2+2r_1s_1+2r_2s_0)x+\dots+2n\,m_{2n}x^{2n-1}$$
This is where I lose it, if I'm even correct up to this point...
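One way to finish this direct computation (a sketch) is to compare the coefficient of $x^{j-1}$ on both sides. On the left, the coefficient of $x^{j-1}$ in $(fg)'(x)$ is $$j\,m_j=j\sum_{i+k=j}r_is_k,$$ while on the right, the coefficient of $x^{j-1}$ in $f'(x)g(x)+f(x)g'(x)$ is $$\sum_{i+k=j}i\,r_is_k+\sum_{i+k=j}k\,r_is_k=\sum_{i+k=j}(i+k)\,r_is_k=j\sum_{i+k=j}r_is_k.$$ Since the coefficients agree for every $j$, the two polynomials are equal.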
First show that the Leibniz rule works for monomials. That is, let $f(x) = rx^a$ and $g(x) = sx^b$. Then $(fg)(x) = rsx^{a+b}$ so that \begin{align*}(fg)'(x) &= (a+b)rsx^{a+b-1} \\ &= arx^{a-1}sx^{b} + bsx^{b-1}rx^{a} \\&= f'(x)g(x) + g'(x)f(x).\end{align*}

Since you have shown that the addition rule works (and by induction it extends to any finite sum), note that $(fg)(x) = \sum_{i=0}^n \sum_{j=0}^m f_i(x)g_j(x)$ is a finite sum of monomial products (where $f_i(x) = r_ix^i$ and $g_j(x) = s_jx^j$). Thus, \begin{align*}(fg)'(x) &= \sum_{i=0}^n \sum_{j=0}^m \bigl(f_i'(x)g_j(x) + g_j'(x)f_i(x)\bigr) \\&= f'(x)g(x) + g'(x)f(x).\end{align*}

This last equality holds because, evidently, $f'(x) = \sum_{i=0}^n f_i'(x)$ and similarly $g'(x) = \sum_{j=0}^m g_j'(x)$, so that rearranging terms in the finite sum yields the desired result.
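As an optional sanity check (not a substitute for the proof), here is a small Python sketch that verifies the Leibniz rule coefficientwise over $\mathbb{Z}/6\mathbb{Z}$, a commutative ring with zero divisors. The polynomial representation and all names here are illustrative.

```python
# Sanity check (not a proof): verify (fg)' = f'g + fg' for polynomials
# over Z/6Z. A polynomial is a coefficient list [r0, r1, ...].
MOD = 6

def mul(f, g):
    """Product of two coefficient lists, coefficients reduced mod MOD."""
    prod = [0] * (len(f) + len(g) - 1)
    for i, r in enumerate(f):
        for k, s in enumerate(g):
            prod[i + k] = (prod[i + k] + r * s) % MOD
    return prod

def add(f, g):
    """Sum of two coefficient lists, padded to a common length."""
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return [(a + b) % MOD for a, b in zip(f, g)]

def deriv(f):
    """Formal derivative: r_j x^j  ->  j*r_j x^(j-1), reduced mod MOD."""
    return [(j * r) % MOD for j, r in enumerate(f)][1:] or [0]

f = [1, 2, 3]      # 1 + 2x + 3x^2
g = [5, 0, 4, 1]   # 5 + 4x^2 + x^3
lhs = deriv(mul(f, g))
rhs = add(mul(deriv(f), g), mul(f, deriv(g)))
print(lhs == rhs)  # True
```

Changing `MOD` checks the identity in other quotient rings; the coefficient lists play the role of the $r_i$ and $s_k$ above.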