Let $u\in C^2([0,1])$, let $1 \le p < \infty$, and let $\epsilon \in (0,1)$ be arbitrary. Show that there exists a constant $C>0$, independent of $u$ and $\epsilon$, such that
$$ \int_0^1 |u'|^p dx \leq \epsilon^p \int_0^1 |u''|^p dx + \frac{C}{\epsilon^p} \int_0^1 |u|^p dx. $$
Hint. Consider an interval $(a,b)\subset(0,1)$ of length $\epsilon$ (same as above). Use the mean value theorem to bound $|u'(x)|$ for some $x\in(a,b)$ in terms of $\epsilon$ and $u(x_1), u(x_2)$ with $x_1, x_2\in(a,b)$. Then use the fundamental theorem of calculus on $u'$ to obtain an estimate of $|u'(x)|$ for every $x\in(a,b)$. Integrate this estimate with respect to $x_1$ and $x_2$, raise everything to the power $p$, and apply Hölder's inequality to obtain an estimate of $|u'(x)|^p$ in terms of the integral quantities on the right-hand side of the desired result.
You are almost on the right track. Following the hint, the mean value theorem gives $\hat x \in (a,b)$ with $b - a = \epsilon$, such that $$u'(\hat x) = \frac1\epsilon \, \bigl(u(b) - u(a)\bigr).$$ By the fundamental theorem of calculus, you find $$u'(x) = \int_{\hat x}^x u''(t) \, \mathrm{d}t + u'(\hat x) = \int_{\hat x}^x u''(t) \, \mathrm{d}t + \frac1\epsilon \, \bigl(u(b) - u(a)\bigr).$$ In particular, the triangle inequality gives the estimate $$|u'(x)| \le \int_a^b |u''(t)| \, \mathrm{d}t + \frac1\epsilon \, |u(b) - u(a)| \quad \forall x \in [a,b].$$ Now, you can set $a = x$, $b = x+\epsilon$, use some elementary calculations, and integrate from $0$ to $1-\epsilon$ (the remaining part can be handled analogously with $a = x-\epsilon$, $b = x$).
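To make those elementary calculations explicit, here is one possible way to finish on $[0, 1-\epsilon]$ (only a sketch; it assumes $1 \le p < \infty$, so that $t \mapsto t^p$ is convex and Hölder's inequality is available). With $a = x$ and $b = x+\epsilon$, the estimate above reads $$|u'(x)| \le \int_x^{x+\epsilon} |u''(t)| \, \mathrm{d}t + \frac1\epsilon \, \bigl(|u(x)| + |u(x+\epsilon)|\bigr) \quad \forall x \in [0, 1-\epsilon].$$ Raising this to the power $p$, using the convexity estimate $(s_1+s_2+s_3)^p \le 3^{p-1}(s_1^p+s_2^p+s_3^p)$ and Hölder's inequality in the form $\bigl(\int_x^{x+\epsilon} |u''| \, \mathrm{d}t\bigr)^p \le \epsilon^{p-1} \int_x^{x+\epsilon} |u''|^p \, \mathrm{d}t$, you get $$|u'(x)|^p \le 3^{p-1} \Bigl( \epsilon^{p-1} \int_x^{x+\epsilon} |u''(t)|^p \, \mathrm{d}t + \frac1{\epsilon^p} |u(x)|^p + \frac1{\epsilon^p} |u(x+\epsilon)|^p \Bigr).$$ Integrating over $x \in [0, 1-\epsilon]$ and using Fubini's theorem in the form $\int_0^{1-\epsilon} \int_x^{x+\epsilon} |u''(t)|^p \, \mathrm{d}t \, \mathrm{d}x \le \epsilon \int_0^1 |u''(t)|^p \, \mathrm{d}t$ leads to $$\int_0^{1-\epsilon} |u'(x)|^p \, \mathrm{d}x \le 3^{p-1} \Bigl( \epsilon^p \int_0^1 |u''|^p \, \mathrm{d}x + \frac{2}{\epsilon^p} \int_0^1 |u|^p \, \mathrm{d}x \Bigr).$$ Adding the analogous estimate on $[1-\epsilon, 1]$ and then replacing $\epsilon$ by a suitable multiple of itself to absorb the $p$-dependent factors yields the claimed inequality with a constant $C$ independent of $u$ and $\epsilon$.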