Prove a statement involving a differentiable function


Let $f$ be continuous on $[a,b]$ and differentiable on $(a,b)$, where $0<a<b$, and let $\lambda\in(0,1)$. It is given that $f(a)=a, f(b)=b$ and $f'(x)\neq 0$ for all $x\in(a,b)$.

Show that there exists $\alpha$ and $\beta$ in $(a,b)$, with $\alpha<\beta$, such that $\frac{1-\lambda}{f'(\alpha)}+\frac{\lambda}{f'(\beta)}=1$.

I sense I need to employ the Mean Value Theorem somehow, but I do not know exactly how to invoke it. Could someone give me a hint?

2 Answers

Accepted answer

Hint: let $k= \lambda a + (1-\lambda) b$, which lies strictly between $a$ and $b$ since $\lambda\in(0,1)$. Because $f(a)=a$ and $f(b)=b$, the intermediate value theorem gives some $c\in (a, b)$ with $f(c) = k$. Now apply the MVT to each of the intervals $[a, c]$ and $[c, b]$.
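Spelling the hint out as a sketch (here $\alpha$ and $\beta$ denote the MVT points on the two subintervals, and $c$ is the point supplied by the intermediate value theorem):

```latex
% MVT on [a,c] and on [c,b] gives \alpha \in (a,c), \beta \in (c,b) with
f'(\alpha) = \frac{f(c)-f(a)}{c-a} = \frac{k-a}{c-a},
\qquad
f'(\beta) = \frac{f(b)-f(c)}{b-c} = \frac{b-k}{b-c}.
% Since k - a = (1-\lambda)(b-a) and b - k = \lambda(b-a), we get
\frac{1-\lambda}{f'(\alpha)} + \frac{\lambda}{f'(\beta)}
  = (1-\lambda)\,\frac{c-a}{k-a} + \lambda\,\frac{b-c}{b-k}
  = \frac{c-a}{b-a} + \frac{b-c}{b-a} = 1.
% Moreover \alpha < c < \beta, so \alpha < \beta as required.
```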


Hint: Suppose the graph of $f$ is linear. In that case, can we guarantee there exist some $\alpha$ and $\beta$ such that $f^\prime(\beta) = f^\prime(\alpha) = \frac{f(b) - f(a)}{b-a} = 1$? Why would this be guaranteed, and why is it useful?

Okay, now what about when $f$ is not linear? How close to a slope of $1$ can we get? Maybe try $f^\prime(\alpha) = 1-\epsilon$ for some small $\epsilon$. Use $\frac{1-\lambda}{f^\prime(\alpha)} + \frac{\lambda}{f^\prime(\beta)} = 1$ to solve for $f^\prime(\beta)$.
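Carrying the suggested algebra through (a sketch; it assumes $0<\epsilon<\lambda$ so that the denominator below stays positive):

```latex
% Substitute f'(\alpha) = 1-\epsilon into the target identity and solve:
\frac{\lambda}{f'(\beta)} = 1 - \frac{1-\lambda}{1-\epsilon}
  = \frac{\lambda-\epsilon}{1-\epsilon}
\quad\Longrightarrow\quad
f'(\beta) = \frac{\lambda(1-\epsilon)}{\lambda-\epsilon},
% which is slightly larger than 1 and tends to 1 as \epsilon \to 0^+.
```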