Assume $f \in C([a, b])$ is twice continuously differentiable and $f''(x) > 0$ on $[a, b]$. Show that the best linear approximation (polynomial of degree at most one) $p$ to $f$ has slope
$$p'(x) = \frac{f(b) - f(a)}{b - a}.$$
I've realized that since $f'' > 0$ on the interval, $f$ is convex (concave up) the whole time, so its maximum on $[a, b]$ is attained at an endpoint, though its minimum may sit in the interior. I know that for $p$ to be the best approximation, I have to pick it so that it minimizes the sup norm $\|\cdot\|_\infty$, i.e. so that $\|f - p\|_\infty = \inf\{\|f - q\|_\infty : q \in \mathcal{P}_1\}$, where $\mathcal{P}_1$ is the set of polynomials of degree at most one (taking the infimum over all of $C([a, b])$ would just give zero). But I've been looking at it for hours and haven't managed to write anything down...
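To convince myself the claimed slope is even right, I tried a quick numerical sanity check (not a proof): for a fixed slope $m$, the best intercept centers $g(x) = f(x) - mx$ between its extremes, so the minimal sup-norm error is $(\max g - \min g)/2$; then I grid-search over $m$. The concrete choice $f(x) = e^x$ on $[0, 1]$ and the grid ranges below are my own, arbitrary test case.

```python
import numpy as np

# Sanity check: for f(x) = exp(x) on [0, 1], the best uniform linear
# approximation should have slope (f(1) - f(0)) / (1 - 0) = e - 1.
a, b = 0.0, 1.0
f = np.exp
x = np.linspace(a, b, 100_001)  # fine grid approximating [a, b]

def best_error_for_slope(m):
    # With the slope m fixed, the optimal intercept c minimizes
    # max |g(x) - c| for g = f - m*x, which is achieved by centering:
    # c = (max g + min g)/2, giving error (max g - min g)/2.
    g = f(x) - m * x
    return (g.max() - g.min()) / 2

ms = np.linspace(1.0, 2.5, 15_001)  # candidate slopes around e - 1
errors = np.array([best_error_for_slope(m) for m in ms])
m_star = ms[errors.argmin()]

secant = (f(b) - f(a)) / (b - a)  # e - 1, about 1.71828
print(m_star, secant)  # the two values should nearly agree
```

Running this, the numerically optimal slope lands right on the secant slope, which at least makes the statement plausible before trying to prove it via equioscillation.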