For $1 \leq r < p < \infty$ prove the continuous injection of $L^p([0, 1])$ into $L^r([0, 1])$.
I am having a hard time starting. Any suggestions? I tried a straightforward approach: given $\epsilon > 0$, I tried to find a $\delta > 0$ such that $\lVert f - g \rVert_p < \delta$ implies $\lVert f - g \rVert_r < \epsilon$.
Thanks for any help.
Alternatively, you can use Hölder's Inequality to show that $$ \lVert x \rVert_r \le \lVert x \rVert_p \tag{1} $$ for every $x \in L^p([0,1])$. Since the inclusion map $i$ is linear and $i(x) - i(y) = x - y$ as functions, (1) gives $\lVert i(x) - i(y) \rVert_r \le \lVert x - y \rVert_p$, so $i$ is $1$-Lipschitz. In particular, given $\epsilon > 0$, taking $\delta = \epsilon$ yields $$ \lVert x - y \rVert_p < \delta \implies \lVert i(x) - i(y) \rVert_r \le \lVert x - y \rVert_p < \epsilon. $$
Hint for (1): $$ \frac{1}{p/r} + \frac{1}{p/(p-r)} = 1 $$
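To unpack the hint: the exponents $p/r$ and $p/(p-r)$ are Hölder conjugates, so applying Hölder's Inequality to the product $|x|^r \cdot 1$ on $[0,1]$ (where the constant function $1$ is integrable precisely because the measure of $[0,1]$ is finite) gives

$$
\begin{aligned}
\lVert x \rVert_r^r = \int_0^1 |x|^r \cdot 1 \, d\mu
&\le \left( \int_0^1 \bigl(|x|^r\bigr)^{p/r} \, d\mu \right)^{r/p}
     \left( \int_0^1 1^{p/(p-r)} \, d\mu \right)^{(p-r)/p} \\
&= \left( \int_0^1 |x|^p \, d\mu \right)^{r/p} \cdot \mu\bigl([0,1]\bigr)^{(p-r)/p}
 = \lVert x \rVert_p^r,
\end{aligned}
$$

since $\mu([0,1]) = 1$. Taking $r$-th roots gives (1). Note that the constant $1$ in (1) depends on $[0,1]$ having measure $1$: on a general finite measure space $(\Omega, \mu)$ the same computation gives $\lVert x \rVert_r \le \mu(\Omega)^{1/r - 1/p} \lVert x \rVert_p$, which is still enough for continuity of the inclusion.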