Q. Let $f:[a,b] \to \mathbb{R}$ be bounded and let $\{P_n\}$ be a sequence of partitions of the interval $[a,b]$ whose meshes tend to zero, i.e. $\lim_{n\to \infty} \|P_n\| = 0$. Prove that:
$\displaystyle\lim_{n\to \infty} L(f,P_n) = \underline{\int_a^b}f(x)\ dx$ and $\displaystyle\lim_{n\to \infty} U(f,P_n) = \overline{\int_a^b}f(x)\ dx$
My attempt:
Using Darboux's theorem (already proved), for every $\epsilon > 0$ there exists $\delta > 0$ such that every partition $P$ with $\|P\| < \delta$ satisfies the two inequalities below. Since $\|P_n\| \to 0$, there is an $N$ such that $\|P_n\| < \delta$ for all $n \geq N$, and hence for all such $n$:
$ \displaystyle \underline{\int_a^b} f(x) \ dx - \epsilon < L(f,P_n) \leq \underline{\int_a^b} f(x) \ dx$
$ \displaystyle \overline{\int_a^b} f(x) \ dx \leq U(f,P_n) < \displaystyle \overline{\int_a^b} f(x) \ dx + \epsilon $
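If I combine the two displays, they seem to say that for every $\epsilon > 0$ there is an $N$ such that for all $n \geq N$:

```latex
\left|\, L(f,P_n) - \underline{\int_a^b} f(x)\,dx \,\right| < \epsilon
\quad\text{and}\quad
\left|\, U(f,P_n) - \overline{\int_a^b} f(x)\,dx \,\right| < \epsilon ,
```

which looks like exactly the $\epsilon$–$N$ definition of the two limits, but I am not sure the quantifiers line up correctly.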
Now my question is: can we simply "take the limit" in these inequalities and use the sandwich theorem to conclude the result? How would I rigorously justify the conclusion from this inequality?
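As a numerical sanity check (not a proof), here is a small sketch with choices of my own: $f(x) = x^2$ on $[0,1]$ and uniform partitions $P_n$ with $\|P_n\| = 1/n$, where both Darboux sums should squeeze toward $\int_0^1 x^2\,dx = 1/3$.

```python
# Sanity check (not a proof): for f(x) = x^2 on [0, 1], the lower and
# upper Darboux sums over uniform partitions should both approach 1/3
# as the mesh 1/n -> 0.  The function f and the uniform partitions are
# my own illustrative choices, not part of the problem statement.

def darboux_sums(f, a, b, n):
    """Lower and upper Darboux sums of f over the uniform n-partition of [a, b].

    Note: taking min/max of the endpoint values is only valid because
    f = x^2 is monotone on [0, 1]; a general f would need a real
    inf/sup estimate on each subinterval.
    """
    h = (b - a) / n
    xs = [a + i * h for i in range(n + 1)]
    lower = sum(min(f(xs[i]), f(xs[i + 1])) * h for i in range(n))
    upper = sum(max(f(xs[i]), f(xs[i + 1])) * h for i in range(n))
    return lower, upper

f = lambda x: x * x
for n in (10, 100, 1000):
    L, U = darboux_sums(f, 0.0, 1.0, n)
    print(n, L, U)  # both columns should squeeze toward 1/3
```

The printed lower sums increase and the upper sums decrease toward the common value, which is the behavior the sandwich argument is supposed to capture.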