Proving that $L_0$ and $L_E$ are subspaces of $L^2$ and orthogonal complements


Let $L_0 = \left\{ \varphi \in L^2 [-a,a] : \varphi(t) = -\varphi (-t) \right\}$ and $L_E = \left\{ \varphi \in L^2 [-a,a] : \varphi(t) = \varphi(-t) \right\}$

I want to show that $L_0$ and $L_E$ are subspaces of $L^2 [-a,a]$ and that $L_E$ is the orthogonal complement of $L_0$. I know that to show they are subspaces, I need to show they are closed under addition and scalar multiplication with respect to the $L^2$ space. However, the computational part of this proof is giving me trouble. Since $$L^2 [-a,a] = \left\{ f : [-a,a] \to \mathbb{C} \ \bigg|\ \int_{-a}^{a} |f(x)|^2 \, dx < \infty \right\},$$ we have to show for $f,g \in L_0$ and $\lambda \in \mathbb{C}$ that $$\int_{-a}^{a} |(f+\lambda g)(x)|^2 \, dx = \int_{-a}^{a} f^2 (x)\, dx + \lambda \int_{-a}^{a} g^2(x) \, dx.$$ I get that $$ \int_{-a}^{a} |(f+\lambda g)(x)|^2 \, dx = \int_{-a}^{a} \left| f^2(x) + \lambda f(x)g(x) + \lambda g(x)f(x) + \lambda^2 g^2(x) \right| dx.$$ I was thinking that maybe I could get the two middle terms to cancel using $f(x) = -f(-x)$ and $g(x) = -g(-x)$, but I couldn't get it. I'm having essentially the same trouble with the $L_E$ subspace.

1 Answer

I'm assuming your underlying space is real, which you seem to be doing as well.

All linear combinations of odd functions are odd, and all linear combinations of even functions are even; so $L_E$ and $L_0$ are linear subspaces of $L^2[-a,a]$.

Suppose that $f \in L_E$ and $g \in L_0$. Then $fg$ is odd, and the substitution $t \mapsto -t$ gives $\int_{-a}^{a} fg \, dt = -\int_{-a}^{a} fg \, dt$, so $\int_{-a}^{a} fg \, dt = 0$. Hence $L_E \perp L_0$.

Moreover, every $f \in L^2$ can be written as $$ f = f_e + f_o, \qquad f_e \in L_E, \; f_o \in L_0, $$ where $$ f_e(t) = \frac{1}{2}\bigl(f(t)+f(-t)\bigr), \qquad f_o(t) = \frac{1}{2}\bigl(f(t)-f(-t)\bigr). $$ So $L_E \perp L_0$ and $L^2 = L_E \oplus L_0$ is an orthogonal direct-sum decomposition. Because of this decomposition, it is automatic that $L_E^{\perp} = L_0$ and $L_0^{\perp} = L_E$.
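As a quick numerical sanity check (not part of the proof, and with an arbitrarily chosen test function and grid), one can split a function on $[-1,1]$ into its even and odd parts and verify that the parts reconstruct the function and are orthogonal in $L^2$:

```python
import numpy as np

# Sanity check of the even/odd decomposition on [-a, a] with a = 1.
# The test function f below is an arbitrary (illustrative) choice.
a = 1.0
t = np.linspace(-a, a, 20001)   # grid symmetric about 0
dt = t[1] - t[0]

def f(x):
    return np.exp(x) + x**2 * np.sin(3 * x)

f_e = 0.5 * (f(t) + f(-t))      # even part: f_e(t) = f_e(-t)
f_o = 0.5 * (f(t) - f(-t))      # odd part:  f_o(t) = -f_o(-t)

# The two parts reconstruct f exactly ...
assert np.allclose(f_e + f_o, f(t))

# ... and their L^2 inner product vanishes, since f_e * f_o is odd.
inner = np.sum(f_e * f_o) * dt  # Riemann-sum approximation of ∫ f_e f_o dt
print(abs(inner) < 1e-8)        # True
```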