I am asked to show that if $f$ is uniformly continuous on the closed, bounded interval $[a,b]$, then $f$ is uniformly continuous on any subinterval $[c,d] \subseteq [a,b]$.
But isn't this obvious? Suppose, for contradiction, that $f$ is not uniformly continuous on the subinterval $[c,d]$. Then there exists an $\epsilon > 0$ such that for every $\delta > 0$, no matter how close I make my points, there are $x, y \in [c,d]$ with $|x - y| < \delta$ but $|f(x) - f(y)| \ge \epsilon$.
But $x$ and $y$ are always elements of $[a,b]$ as well, and we know for a fact that for this $\epsilon$ there is a $\delta$ such that if $x$ and $y$ are within $\delta$ of each other, we must have $|f(x) - f(y)| < \epsilon$. Isn't this a contradiction?
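For what it's worth, the same conclusion also seems to follow directly, without contradiction, since the $\delta$ that works on $[a,b]$ works unchanged on $[c,d]$. A sketch of that direct version:

```latex
\begin{proof}
Let $\epsilon > 0$. Since $f$ is uniformly continuous on $[a,b]$, there is a
$\delta > 0$ such that $|f(x) - f(y)| < \epsilon$ whenever $x, y \in [a,b]$
and $|x - y| < \delta$. Now if $x, y \in [c,d] \subseteq [a,b]$ with
$|x - y| < \delta$, then in particular $x, y \in [a,b]$, so the same bound
$|f(x) - f(y)| < \epsilon$ holds. Hence $f$ is uniformly continuous on $[c,d]$.
\end{proof}
```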