Uniform convergence of integrals of sequences of functions


Just looking for feedback on whether I am thinking about this correctly:

Let $\{f_n\}$ be a sequence in $\mathscr{R}[a,b]$ (the set of Riemann integrable functions on $[a,b]$; I'm not sure whether this notation is standard) that converges uniformly to $f\in\mathscr{R}[a,b]$. For $n \in \mathbb{N}$ set $F_n(x) = \int_{0}^{x}f_n$, and let $F(x) = \int_0^x f$, for $x \in [a,b]$. Prove that $\{F_n\}$ converges uniformly to $F$ on $[a,b]$.

My approach has been to write $\left|\int_{0}^{x}f_n - \int_{0}^{x}f\right| = \left|\int_{0}^{x}(f_n - f)\right|$, and then use the fact that $f_n \to f$ uniformly: for $n$ sufficiently large, $|f_n - f|$ is uniformly small on $[a,b]$, so the integral is bounded by something like $\epsilon$ times the length of the interval of integration, independently of $x$. That should let me bound the difference by any prescribed $\epsilon$ and conclude uniform convergence of $\{F_n\}$. I still haven't found a clean way to write it up, but I feel good about the concept.
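Concretely, the chain of estimates I have in mind is the following (I write the lower limit as $a$ here, since I suspect the $0$ in the problem statement is a typo, as I discuss below): for all $x \in [a,b]$,

$$\left|F_n(x) - F(x)\right| = \left|\int_{a}^{x}(f_n - f)\right| \le \int_{a}^{x}\left|f_n - f\right| \le (b-a)\,\sup_{t\in[a,b]}\left|f_n(t) - f(t)\right|,$$

and the right-hand side does not depend on $x$ and tends to $0$ as $n \to \infty$ by uniform convergence of $\{f_n\}$, which gives uniform convergence of $\{F_n\}$.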

The thing troubling me is that we are given $[a,b]$ but asked to integrate over $[0,x]$, even though $0$ might not lie in the given interval. Is there something I'm missing that makes this a non-issue, or is the $0$ likely a typo for $a$?