I was doing this problem in Rudin's book. 
And I have a question: can the related problem below be handled in a similar way?
If $f_m \geq 0$ on $[0,T]$, $f_m$ is continuous on $[0,T]$, and $\int_0^T f_m(t)\,dt \to 0$ as $m \to \infty$, then $f_m(t)\to 0$ as $m \to \infty$ for every $t \in [0,T]$.
Is this problem true? What do you think?
Thank you so much.
A standard counterexample would be $f_m(t) = t^m$ on the interval $[0,1]$.
Clearly, $$\int_0^1 f_m(t)\,dt = \frac{1}{m+1} \to 0 \text{ as } m\to\infty,$$ yet $$f_m(1)=1 \quad \text{for all } m.$$
Another, more striking counterexample is $$f_m(t) = \begin{cases} m-tm^3, &t \in [0,1/m^2],\\0, & t\in (1/m^2,1]. \end{cases}$$ These functions are non-negative and continuous, their integrals over $[0,1]$ equal $\frac{1}{2m}$ (the area of a triangle with base $1/m^2$ and height $m$), yet $f_m(0) = m \to\infty$ as $m\to\infty$.
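For a quick sanity check of both counterexamples, here is a small numerical sketch (the helper names `trapezoid`, `f1`, `f2` are my own, not from the problem): the integrals shrink toward $0$ while the pointwise values at $t=1$ and $t=0$ do not.

```python
def trapezoid(f, a, b, n=100_000):
    """Composite trapezoid rule for a continuous f on [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

def f1(m):
    # First counterexample: f_m(t) = t^m on [0, 1].
    return lambda t: t ** m

def f2(m):
    # Second counterexample: a thin triangular spike of height m
    # supported on [0, 1/m^2], extended by 0 up to t = 1.
    return lambda t: max(m - t * m ** 3, 0.0)

for m in (10, 50):
    I1 = trapezoid(f1(m), 0.0, 1.0)   # close to 1/(m+1)
    I2 = trapezoid(f2(m), 0.0, 1.0)   # close to 1/(2m)
    # The integrals vanish as m grows, yet f1(m)(1.0) is always 1
    # and f2(m)(0.0) = m blows up.
    print(m, I1, I2, f1(m)(1.0), f2(m)(0.0))
```

The printed integrals match the analytic values $\frac{1}{m+1}$ and $\frac{1}{2m}$ to within the quadrature error, which makes the failure of pointwise convergence easy to see concretely.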
In other words, convergence of $f_m$ to $0$ in the sense of the Lebesgue space $L^1(0,T)$ does not guarantee that $f_m$ converges to zero pointwise, let alone uniformly, on $[0,T]$. Fatou's lemma and the dominated convergence theorem are the classic results for this type of question.