I just want to understand why the following statements are true:
Let $(f_n)$ be a sequence of continuous functions on $[0,1]$, uniformly bounded and orthonormal with respect to the scalar product $(\,\cdot\,,\,\cdot\,)$ of $L^2[0,1]$.
We define $F_n(x)=\int_{0}^{x}f_n(t)\,dt$ for $x\in [0,1]$.
- There is no subsequence of $(f_n)$ that converges in $L^2[0,1]$
- We can find a subsequence of $F_n$ that converges in $L^2[0,1]$
I don't see why such a subsequence exists for $F_n$ but not for $f_n$.
Thanks in advance for your help.
The first claim is nearly trivial: by orthonormality, $\|f_n-f_m\|^2=\|f_n\|^2-2\,(f_n,f_m)+\|f_m\|^2=1-0+1=2$ for $n\neq m$. The same identity holds along any subsequence, so no subsequence is Cauchy, and hence none converges in $L^2[0,1]$.
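As a minimal numerical sketch (my own illustration, not part of the question), take the concrete orthonormal, uniformly bounded system $f_n(t)=\sqrt{2}\sin(n\pi t)$ and check that $\|f_n-f_m\|^2\approx 2$ for $n\neq m$:

```python
import numpy as np

# Hypothetical concrete example: f_n(t) = sqrt(2) sin(n*pi*t) is orthonormal
# in L^2[0,1] and uniformly bounded by sqrt(2).
t = np.linspace(0.0, 1.0, 200001)

def f(n):
    return np.sqrt(2.0) * np.sin(n * np.pi * t)

def l2_norm_sq(g):
    # ||g||^2 = integral of g^2 over [0,1]; since the interval has length 1,
    # the mean over a fine uniform grid is a Riemann-sum approximation.
    return np.mean(g**2)

# ||f_n - f_m||^2 = 2 for every pair n != m, so no subsequence is Cauchy.
for n, m in [(1, 2), (3, 7), (10, 25)]:
    print(n, m, l2_norm_sq(f(n) - f(m)))  # each value is approximately 2.0
```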
The second claim is a consequence of the celebrated Arzelà–Ascoli theorem (https://en.wikipedia.org/wiki/Arzel%C3%A0%E2%80%93Ascoli_theorem): if $|f_n|\le M$ on $[0,1]$ for all $n$, then $|F_n(x)-F_n(y)|=\left|\int_y^x f_n(t)\,dt\right|\le M|x-y|$, so the $F_n$ are uniformly Lipschitz, hence equicontinuous, and also uniformly bounded ($|F_n|\le M$). Therefore $\{F_n\}$ is a (pre-)compact set in $C[0,1]$, so there is a uniformly convergent subsequence, and uniform convergence on $[0,1]$ implies convergence in $L^2[0,1]$ as well.
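Continuing the same hypothetical example $f_n(t)=\sqrt{2}\sin(n\pi t)$, the primitives have the closed form $F_n(x)=\sqrt{2}\,(1-\cos(n\pi x))/(n\pi)$, so $\sup_x|F_n(x)|\le 2\sqrt{2}/(n\pi)$ and the whole sequence $F_n$ already converges uniformly (hence in $L^2$) to $0$. A quick check:

```python
import numpy as np

# For f_n(t) = sqrt(2) sin(n*pi*t), integrating from 0 to x gives
# F_n(x) = sqrt(2) * (1 - cos(n*pi*x)) / (n*pi), bounded by 2*sqrt(2)/(n*pi).
x = np.linspace(0.0, 1.0, 10001)
for n in [1, 10, 100, 1000]:
    F_n = np.sqrt(2.0) * (1.0 - np.cos(n * np.pi * x)) / (n * np.pi)
    print(n, F_n.max())  # sup norm shrinks like 1/n
```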