Let $f$ be continuous on $[a,b]$ and differentiable on $(a,b)$, where $0<a<b$, and let $\lambda\in(0,1)$. It is given that $f(a)=a, f(b)=b$ and $f'(x)\neq 0$ for all $x\in(a,b)$.
Show that there exist $\alpha$ and $\beta$ in $(a,b)$, with $\alpha<\beta$, such that $\frac{1-\lambda}{f'(\alpha)}+\frac{\lambda}{f'(\beta)}=1$.
I sense I need to employ the Mean Value Theorem somehow, but I do not know exactly how to invoke it. Could someone give me a hint?
Hint: let $k= \lambda a + (1-\lambda) b$, which lies strictly between $a$ and $b$. Since $f$ is continuous with $f(a)=a$ and $f(b)=b$, the Intermediate Value Theorem gives $c\in (a, b)$ with $f(c) = k$. Now apply the MVT once to each of the intervals $[a, c]$ and $[c, b]$; the hypothesis $f'(x)\neq 0$ guarantees the resulting quotients are defined.
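Spelling out the algebra the hint leads to (everything below uses only the quantities defined in the question):

```latex
% With k = \lambda a + (1-\lambda) b and c \in (a,b) such that f(c) = k:
%
% MVT on [a,c]: there is \alpha \in (a,c) with
%   f'(\alpha) = \frac{f(c)-f(a)}{c-a} = \frac{k-a}{c-a}
%              = \frac{(1-\lambda)(b-a)}{c-a},
%
% MVT on [c,b]: there is \beta \in (c,b) with
%   f'(\beta) = \frac{f(b)-f(c)}{b-c} = \frac{b-k}{b-c}
%             = \frac{\lambda(b-a)}{b-c}.
%
% Hence
%   \frac{1-\lambda}{f'(\alpha)} + \frac{\lambda}{f'(\beta)}
%     = \frac{c-a}{b-a} + \frac{b-c}{b-a} = 1,
%
% and \alpha < c < \beta gives \alpha < \beta automatically.
```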