Why do we multiply by the power of the polynomial when differentiating it?


Mainly I am thinking about second degree polynomials.
For instance
$f(x)=2x^2$
$f'(x)=4x$
or
$f(x)=ax^n$
$f'(x)=a*n*x^{n-1}$
What I want to know is why we multiply n in front.

From my understanding, the derivative is the rate at which the function is increasing, so I would think that in
$x^2$
the function increases by $x$ per $x$, because
$x^2=x*x$
and the derivative becomes just $x$. Like how in
$2x$
the function increases by $2$ per $x$, because
$2x=2*x$
and the derivative becomes just $2$.
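You can test this intuition numerically with a difference quotient. A quick sketch in Python (the function name `slope` is just something I made up here): the slope of $x^2$ at $x=3$ comes out near $6 = 2 \cdot 3$, not $3$.

```python
# Numerical slope of f at x via the difference quotient
# (f(x + h) - f(x)) / h with a small step h.
def slope(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

# At x = 3 the slope of x^2 is close to 2*3 = 6, not 3:
print(slope(lambda x: x**2, 3))  # close to 6
```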

I am also deeply sorry if this way of thinking makes someone cringe, my maths teacher for instance, but it makes sense to me.

I already know some proofs, for instance that the area under the derivative of a second degree polynomial is a triangle, which is why you have to divide by two if you want to find the integral, but it just doesn't make much sense to me the other way around.
I could also probably prove this algebraically with the definition of the derivative, but I just want a logical explanation of the correlation, like the one I showed with my own way of thinking.

There are 2 answers below.

Best answer

Clearly, $$f(x)*g(x) = g(x)*f(x)$$ Thus, $$(f(x)*g(x))' = (g(x)*f(x))'$$ Thus, it cannot be the case that $$(f(x)*g(x))' = f(x)'*g(x)$$ Replace both $f(x)$ and $g(x)$ by $x$, and we see that it cannot be that $$(x*x)' = x'*x = x$$ If we don't break the symmetry between $f$ and $g$, and use the correct product rule, $$(f(x)*g(x))' = f(x)'*g(x) + f(x)*g(x)'$$ we see that $$(x*x)' = x'*x + x*x' = x + x = 2x$$

(ah, but why isn't the product rule $(f(x)*g(x))' = \frac{1}{2}f(x)'*g(x) + \frac{1}{2}f(x)*g(x)'$? That would also have the right symmetry!)
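The half-weighted rule has the right symmetry, but it fails a numerical test. A sketch in Python (names like `slope` and `half_rule` are mine) comparing both candidate rules on $f(x)=x$, $g(x)=x^2$, whose product is $x^3$ with derivative $3x^2$:

```python
# Compare the usual product rule against the half-weighted candidate,
# using a small-h difference quotient as the ground truth.
def slope(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

f = lambda x: x        # f(x) = x
g = lambda x: x**2     # g(x) = x^2
x = 2.0

true_slope = slope(lambda t: f(t) * g(t), x)                  # d/dx x^3 at 2: about 12
product_rule = slope(f, x) * g(x) + f(x) * slope(g, x)        # 1*4 + 2*4 = 12, matches
half_rule = 0.5 * (slope(f, x) * g(x) + f(x) * slope(g, x))   # 6, does not match
print(true_slope, product_rule, half_rule)
```

Only the unweighted sum reproduces the true slope; halving each term undercounts by exactly a factor of two here.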

Another answer

The answer is that the derivative of a function is defined as the gradient of the tangent to the function's graph, which you can write as $$f'(x) = \lim_{h \rightarrow 0} \frac{f(x + h) - f(x)}{h}$$

When $f(x) = a x^n$, we can calculate this directly as:

$$\begin{eqnarray} f'(x) & = & \lim_{h \rightarrow 0} \frac{f(x + h) - f(x)}{h} \\ & = & \lim_{h \rightarrow 0} \frac{a(x + h)^n - a x^n}{h} \\ & = & \lim_{h \rightarrow 0} a \frac{(x^n + n x^{n - 1} h + O(h^2)) - (x^n)}{h} \\ & = & \lim_{h \rightarrow 0} a \left(n x^{n - 1} + O(h) \right) \\ & = & a n x^{n - 1} \end{eqnarray}$$

where the $O(h)$ terms are multiples of $h$ and hence vanish in the limit.

In particular, with $f(x) = x^2$, notice that $(x + h)^2 - x^2 = (x + h - x)(x + h + x) = h(2x + h)$, so if you increase $x$ slightly, then $x^2$ increases proportional to $2x$ (plus a little bit more).
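The general formula $f'(x) = a\,n\,x^{n-1}$ can be checked the same way for several powers at once; a short Python sketch (names my own) compares the difference quotient against the formula:

```python
# Check the power rule a*n*x^(n-1) against the difference quotient
# for a few exponents n at a fixed point x.
def slope(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

a, x = 2.0, 1.5
for n in (1, 2, 3, 5):
    approx = slope(lambda t: a * t**n, x)
    exact = a * n * x**(n - 1)
    print(n, round(approx, 3), exact)  # approx and exact agree to ~3 decimals
```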