Induction with Rolle's Theorem Question (Solution Attempt)


I am attempting to solve a problem which goes as follows:

Suppose $f$ is differentiable on $\mathbb{R}$ and that $f$ has $n$ distinct real roots. Prove that $f'$ has at least $n-1$ distinct real roots.

Now this question is very clearly hinting at using Rolle's theorem and my attempted solution did just that:


We know by Rolle's Theorem that between any two roots $a<b$ of a function, $\exists$ some $c\in(a,b)$ s.t. $f'(c)=0$. So, let $r_i$ denote the $i^{\text{th}}$ root of $f$, with $i\in\{1,2,\dots,n\}$ and $r_i<r_{i+1}$ $\forall i$. Now, the interval $[r_1,r_n]$ can be broken up into $n-1$ subintervals (overlapping only at their endpoints), each bounded by a pair of consecutive roots of $f$, i.e. $[r_1,r_n]=[r_1,r_2]\cup[r_2,r_3]\cup\dots\cup[r_{n-1},r_n]$. My last step would have been to prove that each of these intervals contains at least one point $c_i\in(r_i,r_{i+1})$ with $f'(c_i)=0$, and to do this inductively (using Rolle's Theorem) to finally prove that $f'$ has at least $n-1$ roots.
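For concreteness, here is one way the induction on the number of roots could be organized (a sketch only, with the inductive hypothesis strengthened to locate the roots of $f'$ inside $(r_1,r_n)$ so that distinctness comes for free):

```latex
\begin{proof}[Sketch: induction on the number of roots $n$]
\emph{Base case} ($n=1$): the claim that $f'$ has at least $0$
distinct real roots holds vacuously.

\emph{Inductive step}: assume that whenever a differentiable function
has $n$ distinct roots $r_1 < \dots < r_n$, its derivative has at
least $n-1$ distinct roots in the open interval $(r_1, r_n)$. Now
suppose $f$ has $n+1$ distinct roots $r_1 < \dots < r_{n+1}$.
Applying the hypothesis to the first $n$ roots gives $n-1$ distinct
roots of $f'$ in $(r_1, r_n)$. Applying Rolle's Theorem to $f$ on
$[r_n, r_{n+1}]$ gives some $c \in (r_n, r_{n+1})$ with $f'(c) = 0$.
Since $(r_n, r_{n+1})$ is disjoint from $(r_1, r_n)$, the root $c$ is
distinct from the earlier $n-1$ roots, so $f'$ has at least $n$
distinct roots in $(r_1, r_{n+1})$, completing the induction.
\end{proof}
```

Note that the strengthening of the hypothesis (locating the roots of $f'$ in $(r_1,r_n)$, not merely counting them) is what makes the new root automatically distinct from the old ones.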

My question lies in the induction step. Is this an appropriate application of induction? I've usually only seen induction used where there is some meaningful progression of logical arguments, whereas here I feel as though I would just be restating Rolle's Theorem on each interval. Would it be better (more correct?) to apply the induction to the number of roots rather than to the intervals I created?