Show that there is a constant $C$ so that $$\left| \frac{f(0)+f(1)}{2} - \int_0^1 f(x) \;dx \right| \le C \int_0^1 |f''(x)|\;dx$$ for every $C^2$ function $f: \mathbb{R} \to \mathbb{R}.$
For $x \in [0,1]$ we can write $f(x) = f(0) + f'(0)x + O(x^2)$ and $f(x) = f(1) - f'(1)(1-x) + O((1-x)^2).$ Averaging the two gives $f(x) = \frac12 \left[f(0)+f(1) + x(f'(0)+f'(1)) - f'(1)\right] + O(x^2) + O((1-x)^2),$ so \begin{align*} \left|\frac{f(0)+f(1)}{2} - \int_0^1 f(x) \;dx \right| &= \left|\frac12\int_0^1 x(f'(1)-f'(0)) + (1-2x)f'(1) +O(x^2)\,dx \right|\\ &= \left| \frac12 \int_0^1 x \int_0^1 f''(t)\,dt + (1-2x)f'(1) + O(x^2)\,dx\right|.\end{align*}
I get the term I want, but $\{f'(1) \;| \; f \in C^2(\mathbb{R})\}$ is unbounded, so this estimate is too crude. How can I improve it to get the desired bound?
You can isolate the $f''$ integral by integrating by parts twice, passing the derivatives onto $f$. Thus:
\begin{align} \int_0^1 f(x) dx & = (1+C_1) f(1) - C_1 f(0) - \int_0^1 (x+C_1) f'(x) dx \\ & = (1+C_1) f(1) - C_1 f(0) \\ & - \left ( \left ( \frac{1}{2} + C_1 + C_2 \right ) f'(1) - C_2 f'(0) - \int_0^1 \left ( \frac{1}{2} x^2 + C_1 x + C_2 \right ) f''(x) dx \right ). \end{align}
This holds for any $C_1,C_2$. You can choose $C_1,C_2$ so that the terms involving $f$ reproduce the one-subinterval trapezoidal rule $\frac{f(0)+f(1)}{2}$ and the terms involving $f'$ vanish. The last term is then of the form $\int_0^1 g(x) f''(x)\, dx$ where $g$ does not depend on $f$, so $C=\max_{x \in [0,1]} |g(x)|$ suffices, since $\left|\int_0^1 g(x) f''(x)\, dx\right| \le \max_{x \in [0,1]} |g(x)| \int_0^1 |f''(x)|\, dx.$
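Spelling out the hint for concreteness (this completes the computation, so skip it if you want to work it out yourself): matching the $f$-terms with the trapezoidal rule forces $1+C_1 = \frac12$, i.e. $C_1 = -\frac12$ (and then $-C_1 f(0) = \frac12 f(0)$ comes for free), while making the $f'$-terms vanish forces $\frac12 + C_1 + C_2 = 0$ and $C_2 = 0$, which are consistent. The kernel is then $$g(x) = \frac12 x^2 - \frac12 x = -\frac12 x(1-x), \qquad \max_{x \in [0,1]} |g(x)| = \left|g\left(\tfrac12\right)\right| = \frac18,$$ so $C = \frac18$ works.

As a sanity check on the resulting bound $\left|\frac{f(0)+f(1)}{2} - \int_0^1 f\right| \le \frac18 \int_0^1 |f''|$ (numerical evidence only, not a proof; `check` is a helper name introduced here, not from any library):

```python
# Verify |(f(0)+f(1))/2 - integral of f| <= (1/8) * integral of |f''|
# on a few sample C^2 functions, approximating both integrals on [0, 1]
# with the composite midpoint rule.
import math

def check(f, d2f, n=20000):
    h = 1.0 / n
    xs = [(i + 0.5) * h for i in range(n)]               # midpoint grid on [0, 1]
    integral_f = h * sum(f(x) for x in xs)               # approx. integral of f
    integral_abs_d2f = h * sum(abs(d2f(x)) for x in xs)  # approx. integral of |f''|
    err = abs((f(0.0) + f(1.0)) / 2.0 - integral_f)      # trapezoidal-rule error
    return err, integral_abs_d2f / 8.0                   # (error, claimed bound)

samples = [
    (math.exp, math.exp),                    # f = e^x,   f'' = e^x
    (math.sin, lambda x: -math.sin(x)),      # f = sin x, f'' = -sin x
    (lambda x: x**4, lambda x: 12 * x**2),   # f = x^4,   f'' = 12 x^2
]
for f, d2f in samples:
    err, bound = check(f, d2f)
    assert err <= bound + 1e-9, (err, bound)
print("bound holds on all samples")
```

For $f(x) = x^2$ the error is exactly $\frac16$ against a bound of $\frac14$; the constant $\frac18$ is nevertheless sharp, as one sees by concentrating $f''$ near $x = \frac12$, where $|g|$ peaks.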