Real analysis problem: Rolle's theorem, Darboux's theorem, induction


I am taking an undergraduate course in real analysis, and I'm supposed to solve a problem and present it in class. I've just started, but I'm already stuck. This is the problem:

Let a function $f$ be continuous on $[0,1]$ and differentiable on $(0,1)$. Assume that $f(0)=f(1)$. Prove that for any $n\in \mathbb{N}$ there exist points $0<c_1<...<c_n<1$ such that $\sum_{k=1}^nf'(c_k)=0$. Hint: apply Rolle's theorem, induction and Darboux's theorem.

This is what I have so far:
Let $g(x):=f(x)-f(1)$; then $g$ is continuous on $[0,1]$ and differentiable on $(0,1)$. Since $g(1)=g(0)=0$, Rolle's theorem gives a point $c_1\in (0,1)$ s.t. $g'(c_1) =f'(c_1) =0$.

Darboux's theorem requires that a function be differentiable on a closed interval, so I need some point $c$ with $0<c_1<c<1$ so that the function is differentiable on $[c_1,c]$.

I haven't gotten any further than this and would like advice on how to proceed from here.

Update: I asked my lecturer, and he said that the first part with Rolle's theorem would be the first step of the induction proof. So then I would assume that there exist $n$ points $0<c_1<...<c_n<1$ s.t. $\sum_{k=1}^n f'(c_k)=0$ and then use that to prove that for $n+1$ points $0<c_1<...<c_n<c_{n+1}<1$ we also have that $\sum_{k=1}^{n+1}f'(c_k)=0$.
$\sum_{k=1}^{n+1}f'(c_k)=\sum_{k=1}^nf'(c_k)+f'(c_{n+1}) = f'(c_{n+1})$. If $f'(c_{n+1})=0$ then the problem is solved. But what if $f'(c_{n+1})>0$ or $f'(c_{n+1})<0$?
I guess this is where I'm supposed to use Darboux's theorem, but I'm not sure how.


There are two answers below.

First answer:

Okay, this one does not use induction, and a couple of steps need better justification, but I thought I'd just share it anyway :-)

First, if $f'(x) = 0$ for all $x\in(0,1)$, then $f$ is constant and any $n$ points $0<c_1<...<c_n<1$ will do.

If $f'$ is not identically zero, it cannot be that $f'(x) \ge 0$ everywhere (or $\le 0$ everywhere), because then $f(1) > f(0)$ (respectively $f(1) < f(0)$). (To prove this you need Darboux's theorem; the point is that $f'$ can't be $0$ "almost always" with a few isolated points where it is $>0$, because of the intermediate value property.)

So $f'$ changes sign somewhere: there are two points $d_1, d_2 \in (0,1)$ such that $f'(d_1) > 0$ and $f'(d_2) < 0$.

Now define the function $$S(c_1, \dots, c_n) = \sum_{i=1}^n f'(c_i)$$

This function has the intermediate value property because $f'(x)$ does. And since $$S(d_1, d_1, \dots, d_1) = \sum_{i=1}^n f'(d_1) = nf'(d_1) > 0$$ and $$S(d_2, d_2, \dots, d_2) = \sum_{i=1}^n f'(d_2) = nf'(d_2) < 0$$

there have to be (again, by the intermediate value property) $n$ points $c_1^*, \dots, c_n^*$ such that $$S(c_1^*, \dots, c_n^*) = \sum_{i=1}^n f'(c_i^*) = 0$$ which is what we wanted to prove. (Strictly speaking, one should also arrange the points to be distinct and increasing; that is one of the details that needs more care.)
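The claim that $S$ inherits the intermediate value property deserves a word, since $f'$ need not be continuous. One way to fill that gap (a sketch):

```latex
% Move from (d_1,\dots,d_1) to (d_2,\dots,d_2) by switching one coordinate at a time:
$$(d_1,\dots,d_1)\to(d_2,d_1,\dots,d_1)\to(d_2,d_2,d_1,\dots,d_1)\to\dots\to(d_2,\dots,d_2).$$
% S starts positive and ends negative, so its sign changes at some step j.
% During step j only the j-th coordinate varies, and as a function of that coordinate
$$t\mapsto S(d_2,\dots,d_2,\,t,\,d_1,\dots,d_1)=f'(t)+\underbrace{(j-1)f'(d_2)+(n-j)f'(d_1)}_{\text{constant}},$$
% which has the intermediate value property by Darboux's theorem applied to f',
% so it takes the value 0 for some t between d_1 and d_2.
```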

Second answer:

I'd observe that $0=f(1)-f(0) = \sum_{k=1}^n\left[f(k/n)-f((k-1)/n)\right]$ and then apply the MVT on each interval $[(k-1)/n,k/n]$: it gives a point $c_k\in((k-1)/n,k/n)$ with $f(k/n)-f((k-1)/n)=f'(c_k)/n$. Summing, $0=\frac1n\sum_{k=1}^n f'(c_k)$, so $\sum_{k=1}^n f'(c_k)=0$, and the points are automatically ordered $0<c_1<...<c_n<1$ since they lie in consecutive subintervals.
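As a numerical sanity check (not part of the proof), this construction can be sketched in Python. The sample function $f(x)=x^3-x$ is my own choice (any $f$ with $f(0)=f(1)$ would do); since $f'(x)=3x^2-1$ is strictly increasing on $[0,1]$, the MVT point in each subinterval can be located by bisection:

```python
# Numerical illustration of the MVT-partition argument:
# find c_k in each subinterval [(k-1)/n, k/n] with f'(c_k) equal to the
# average slope there, then check that the f'(c_k) sum to ~0.

def f(x):
    return x**3 - x      # sample function with f(0) == f(1) == 0

def fprime(x):
    return 3 * x**2 - 1  # its derivative, strictly increasing on [0, 1]

def bisect(g, a, b, tol=1e-12):
    """Root of g in [a, b], assuming g(a) and g(b) have opposite signs."""
    ga = g(a)
    for _ in range(200):
        m = (a + b) / 2
        gm = g(m)
        if ga * gm <= 0:
            b = m            # root lies in [a, m]
        else:
            a, ga = m, gm    # root lies in [m, b]
        if b - a < tol:
            break
    return (a + b) / 2

n = 5
points = []
for k in range(1, n + 1):
    lo, hi = (k - 1) / n, k / n
    slope = n * (f(hi) - f(lo))  # average slope on the subinterval
    points.append(bisect(lambda x: fprime(x) - slope, lo, hi))

print(points)                          # increasing points in (0, 1)
print(sum(fprime(c) for c in points))  # telescopes to ~0
```

The bisection works here because a monotone $f'$ guarantees a sign change of $f'(x) - \text{slope}$ across each subinterval; for a general $f$ one would only know such a point exists, not how to find it this way.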