An exercise about orthonormal bases from A. Friedman's book Foundations of Modern Analysis


Exercise 6.4.6 from A. Friedman's book Foundations of Modern Analysis: Show that an orthonormal sequence $\{e_n\}_{n\ge 1}$ is complete in $L^2([a,b])$ if $$\sum_{n=1}^\infty \left(\int_a^x e_n(t)\,dt\right)^2 = x-a$$ for all $x\in [a,b]$.
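As a sanity check (not part of the exercise), the identity can be verified numerically for a basis that is known to be complete. A minimal sketch, assuming $[a,b]=[0,1]$ and the cosine basis $\{1,\sqrt{2}\cos(\pi t),\sqrt{2}\cos(2\pi t),\dots\}$ of $L^2([0,1])$, using the closed-form antiderivatives $\int_0^x 1\,dt = x$ and $\int_0^x \sqrt{2}\cos(n\pi t)\,dt = \sqrt{2}\sin(n\pi x)/(n\pi)$:

```python
import math

def partial_sum(x, N=20000):
    """Truncated sum of (integral from 0 to x of e_n)^2 for the cosine basis."""
    s = x * x  # contribution of the constant function e_0 = 1
    for n in range(1, N + 1):
        # (sqrt(2) * sin(n*pi*x) / (n*pi))^2 = 2 sin^2(n*pi*x) / (n*pi)^2
        s += 2.0 * math.sin(n * math.pi * x) ** 2 / (n * math.pi) ** 2
    return s

for x in (0.1, 0.25, 0.5, 0.9):
    print(x, partial_sum(x))  # each value is close to x, i.e. x - a with a = 0
```

The terms decay like $1/n^2$, so the truncation error after $N$ terms is of order $1/N$; with $N = 20000$ the partial sums agree with $x$ to about four decimal places.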

I have thought about this question but cannot make progress. Could someone help me?

Best answer:

Suppose $\{e_n\}$ is not complete. Extend $\{e_n\}$ to an orthonormal basis $\{e_n\} \cup \{g_n\}$ of $L^2([a,b])$, and for $x \in [a,b]$ let $f_x = \chi_{(a,x)}$. By Parseval's identity, $$\|f_x\|^{2}=\sum_n \langle f_x , e_n\rangle ^{2} + \sum_n \langle f_x , g_n\rangle ^{2}.$$ Since $\langle f_x, e_n\rangle = \int_a^x e_n(t)\,dt$ and $\|f_x\|^{2} = x-a$, the hypothesis gives $\|f_x\|^{2}=\sum_n \langle f_x , e_n\rangle ^{2}$. Hence $\sum_n \langle f_x , g_n\rangle ^{2}=0$, so $\int_a^{x} g_n(t)\, dt=0$ for all $n$ and all $x \in [a,b]$. Since an integrable function whose integral over $(a,x)$ vanishes for every $x$ must vanish almost everywhere (differentiate, using the Lebesgue differentiation theorem), it follows that $g_n=0$ almost everywhere for each $n$, which contradicts the fact that $\|g_n\|=1$ for each $n$.

PS: I have given the proof for real scalars; for the complex case you only have to add some absolute value signs.
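The contrapositive in this argument can also be illustrated numerically. A hedged sketch, again assuming $[0,1]$ and the cosine basis: dropping the constant function leaves an incomplete orthonormal family $\{\sqrt{2}\cos(n\pi t)\}_{n\ge 1}$, and the shortfall in the identity is exactly $\langle f_x, 1\rangle^2 = x^2$, i.e. the sum over the "missing" vectors $g_n$ in the proof above.

```python
import math

def cosine_sum(x, N=20000):
    """Truncated sum of (integral from 0 to x of sqrt(2)cos(n*pi*t) dt)^2, n >= 1.

    This deliberately omits the constant function, so the family is incomplete.
    """
    return sum(2.0 * math.sin(n * math.pi * x) ** 2 / (n * math.pi) ** 2
               for n in range(1, N + 1))

x = 0.4
deficit = x - cosine_sum(x)  # shortfall from the completeness identity sum = x
print(deficit)  # close to x**2 = 0.16, the squared coefficient of the dropped vector
```

The numerical deficit matching $\langle f_x, 1\rangle^2$ is precisely the Parseval decomposition used in the answer: the squared coefficients of the removed vectors account for everything the incomplete family misses.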