Sum of Fourier coefficients less than the norm of the respective vector


I have the following question to complete.

Let $X$ be an inner product space. Let $(e_{j})_{j\geq1}$ be an orthonormal sequence in $X$. Show that, \begin{align} \sum_{j=1}^{\infty}|(x|e_{j})(y|e_{j})|\leq\|x\|\|y\|, \end{align} for all $x,y\in X$.

I have tried to use the Cauchy-Schwarz inequality.

\begin{align} \sum_{j=1}^{\infty}|(x|e_{j})(y|e_{j})|\leq\|x\|\|y\|\sum_{j=1}^{\infty}\|e_{j}\|^{2}. \end{align} However, that remaining sum diverges, since each $\|e_{j}\|=1$.

I tried to use Parseval's Identity, but that didn't work either. Can someone offer me a hint?


2 Answers

Best answer

Use the Cauchy-Schwarz inequality for sequences rather than for the inner product: $\sum |\langle x,e_i \rangle \langle y,e_i \rangle| \leq (\sum |\langle x,e_i \rangle|^{2})^{1/2}(\sum |\langle y,e_i \rangle|^{2})^{1/2} \leq \|x\|\|y\|$, where the last step is Bessel's inequality applied to each factor.
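As a quick numerical sanity check (not part of the proof), one can test this in a finite-dimensional real inner product space, taking orthonormal vectors from a QR factorization. The names below (`Q`, `lhs`, `rhs`) and the dimensions are my own choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 10, 6  # ambient dimension, number of orthonormal vectors (k < d)
# QR of a random matrix gives a matrix Q with orthonormal columns e_1, ..., e_k
Q, _ = np.linalg.qr(rng.standard_normal((d, k)))

x = rng.standard_normal(d)
y = rng.standard_normal(d)

# left side: sum_j |<x, e_j> <y, e_j>|  (Q.T @ x collects the components <x, e_j>)
lhs = np.sum(np.abs((Q.T @ x) * (Q.T @ y)))
# right side: ||x|| ||y||
rhs = np.linalg.norm(x) * np.linalg.norm(y)
assert lhs <= rhs + 1e-12
```

Running this with many random seeds never violates the inequality, as the two-step Cauchy-Schwarz/Bessel argument predicts.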

Another answer

The Cauchy-Schwarz inequality should just have $\|x\|\cdot\|y\|$ on the right hand side; you've got the statement of it mixed up.

In my post, the notation $(u|e_j)$ denotes the component of $u$ with respect to the unit vector $e_j$, which is equal to the inner product $\langle u,e_j\rangle$. This notation will never be used for anything other than a member of our orthonormal set $e_j$ in the second position.

Let $x_n$ be the projection of $x$ onto the space spanned by the first $n$ of the $e_j$, and similarly for $y_n$. Then $$\sum_{j=1}^n (x|e_j)(y|e_j) = \langle x_n,y_n\rangle \le \|x_n\|\cdot \|y_n\| \le \|x\|\cdot\|y\|$$ There's no sum of $\|e_j\|^2$ there. The first equality is simply the inner product computed with respect to the standard (orthonormal) basis in $n$ dimensions, and the last inequality holds because orthogonal projection does not increase the norm.

What about putting the absolute values in? Well, we could always rotate the $(x|e_j)$ and $(y|e_j)$ components to be positive; by orthogonality, that wouldn't affect $\|x_n\|$, $\|y_n\|$, or any of the other components.

To clarify about the absolute values:
Define $x'_n$ as follows: $x'_n=|(x|e_1)|e_1+|(x|e_2)|e_2+\cdots+|(x|e_n)|e_n$ - the sum of the absolute values of the components times the basis vectors - and similarly for $y'_n$. We claim that $\|x'_n\|=\|x_n\|$ and $\|y'_n\|=\|y_n\|$. Why? Because $$\|x'_n\|^2=\sum_{j=1}^n |(x'_n|e_j)|^2 = \sum_{j=1}^n |(x|e_j)|^2=\|x_n\|^2$$ Then, applying this, $$\sum_{j=1}^n |(x|e_j)|\cdot |(y|e_j)| = \langle x'_n,y'_n\rangle \le \|x'_n\|\cdot \|y'_n\| = \|x_n\|\cdot \|y_n\| \le \|x\|\cdot \|y\|$$ by Cauchy-Schwarz for the first inequality and Bessel's inequality for the second. Done.