multiplication of finite sum (inner product space)


I am having difficulty understanding the first line of the proof of Theorem 3.22 (taken from a linear analysis book).

Why do the indices need to be different, i.e. $m,n$, when multiplying the two sums? This is very basic, but I would really appreciate an explanation.


3 Answers


That's because it is notationally bad and, depending on how you read it, it may even give different results. For example, consider simple vectors in $\mathbb{R}^{3}$ given by $v=a_{1}e_{1}+a_{2}e_{2}+a_{3}e_{3}=\sum_{n=1}^{3}a_{n}e_{n}$. Then if we use the same index,
\begin{equation} \begin{aligned} ||v||^{2}&=\left(\sum_{n=1}^{3}a_{n}e_{n},\sum_{n=1}^{3}a_{n}e_{n}\right)\\ &=\sum_{n=1}^{3}\sum_{n=1}^{3}a_{n}\bar{a}_{n}(e_{n},e_{n})\\ &=\sum_{n=1}^{3}\left(a_{1}\bar{a}_{1}(e_{1},e_{1})+a_{2}\bar{a}_{2}(e_{2},e_{2})+a_{3}\bar{a}_{3}(e_{3},e_{3})\right), \end{aligned} \end{equation}
which clearly does not make sense, because we are left with an extra summation. Even if you remove the extra summation, we get
\begin{equation} \begin{aligned} ||v||^{2}&=\left(\sum_{n=1}^{3}a_{n}e_{n},\sum_{n=1}^{3}a_{n}e_{n}\right)\\ &=\sum_{n=1}^{3}a_{n}\bar{a}_{n}(e_{n},e_{n})\\ &=a_{1}\bar{a}_{1}(e_{1},e_{1})+a_{2}\bar{a}_{2}(e_{2},e_{2})+a_{3}\bar{a}_{3}(e_{3},e_{3}), \end{aligned} \end{equation}
which notationally "looks" fine but is wrong: the expansion of an inner product should hold for any basis, not just an orthonormal one, and here all the cross terms $a_{m}\bar{a}_{n}(e_{m},e_{n})$ with $m\neq n$ have been silently dropped. The different indices allow us to use the power of good notation correctly.
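Not from the book, just a quick numerical sanity check: with a non-orthonormal basis of $\mathbb{R}^3$ (chosen here arbitrarily for illustration), the diagonal-only expansion really does miss the cross terms.

```python
# Hypothetical non-orthonormal basis of R^3: e_1 and e_2 are not orthogonal.
e = [(1.0, 0.0, 0.0),
     (1.0, 1.0, 0.0),
     (0.0, 0.0, 1.0)]
a = [2.0, 3.0, 4.0]  # coefficients a_1, a_2, a_3 (real, so conjugation is trivial)

def dot(u, w):
    """Standard inner product on R^3."""
    return sum(ui * wi for ui, wi in zip(u, w))

# v = sum_n a_n e_n
v = tuple(sum(a[n] * e[n][i] for n in range(3)) for i in range(3))

# Correct expansion: a double sum over independent indices m and n.
double_sum = sum(a[m] * a[n] * dot(e[m], e[n])
                 for m in range(3) for n in range(3))

# "Same index" expansion keeps only the diagonal terms a_n a_n (e_n, e_n).
diagonal_only = sum(a[n] * a[n] * dot(e[n], e[n]) for n in range(3))

print(dot(v, v))      # 50.0  -- the true ||v||^2, since v = (5, 3, 4)
print(double_sum)     # 50.0  -- matches
print(diagonal_only)  # 38.0  -- wrong: the cross terms are missing
```

With an orthonormal basis the cross terms vanish anyway, which is why the two expansions only happen to agree in that special case.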


The indices are different because the sums are independent of one another. If one intends to multiply the sums, this distinction is a critical one.

Suppose that we were to multiply the summations $\sum_{n=1}^2a_n$ and $\sum_{n=1}^2b_n$ and naively failed to make this distinction. Then we would incorrectly write

$$\begin{align} \left(\sum_{n=1}^2a_n\right)\left(\sum_{n=1}^2b_n\right)&=\sum_{n=1}^2\sum_{n=1}^2a_nb_n\\\\ &=2(a_1b_1+a_2b_2)\tag 1 \end{align}$$

However, the correct product that preserves the independence of the summation indices is given by

$$\begin{align} \left(\sum_{m=1}^2a_m\right)\left(\sum_{n=1}^2b_n\right)&=\sum_{m=1}^2\sum_{n=1}^2 a_mb_n=\sum_{m=1}^2a_m(b_1+b_2)\\\\ &=a_1b_1+a_1b_2+a_2b_1+a_2b_2 \tag 2 \end{align}$$

Evidently, $(1)$ fails to give the correct result: it has inappropriately fused the identities of the terms being multiplied, dropping the cross terms $a_1b_2$ and $a_2b_1$ and double-counting the "diagonal" terms $a_1b_1$ and $a_2b_2$.
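A tiny Python check (my own sketch, with arbitrary numbers) makes the difference between $(1)$ and $(2)$ concrete:

```python
a = [1.0, 2.0]
b = [3.0, 5.0]

# The actual product of the two sums: (a1 + a2)(b1 + b2).
product = sum(a) * sum(b)

# Correct double sum (2), with independent indices m and n.
correct = sum(a[m] * b[n] for m in range(2) for n in range(2))

# Naive "same index" version (1): only the a_n b_n terms, each counted twice.
naive = sum(2 * a[n] * b[n] for n in range(2))

print(product, correct, naive)   # 24.0 24.0 26.0
```

Only the double sum with distinct indices reproduces the true product; the fused-index version replaces the cross terms with a second copy of the diagonal terms.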


Orthonormality refers to the basis $e_i$. When a basis is orthonormal, it means the inner product between any two elements of the basis $e_i,e_j$ is $\langle e_i, e_j \rangle = \delta_{ij}$ (see Kronecker delta). More generally, two vectors $u,v$ are orthogonal if $\langle u, v \rangle = 0$. The normality part comes from the elements of the basis having norm $1$.

The part on algebraic properties refers simply to bilinearity. If $u,v,w$ are vectors and $\alpha \in \mathbb{R}$, then $\langle \alpha \cdot u, v\rangle=\alpha \cdot \langle u, v \rangle = \langle u, \alpha \cdot v \rangle$, $\langle u + v, w\rangle = \langle u, w\rangle + \langle v, w\rangle$, and $\langle u,v+w\rangle = \langle u, v\rangle+\langle u, w\rangle$.

The proof uses the fact that $\lVert v \rVert^2 = \langle v, v \rangle$ and the properties above to expand the inner product, then the orthonormality to simplify it.

In the end, this theorem says that for orthonormal bases, you can use Pythagoras's Theorem to calculate lengths.
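To illustrate that last point (a hypothetical example, using the standard orthonormal basis of $\mathbb{R}^3$): once the basis is orthonormal, the double sum collapses to $\sum_n |a_n|^2$, which is exactly Pythagoras.

```python
import math

# Standard orthonormal basis of R^3.
e = [(1.0, 0.0, 0.0),
     (0.0, 1.0, 0.0),
     (0.0, 0.0, 1.0)]
a = [3.0, 4.0, 12.0]  # coefficients a_n

def dot(u, w):
    """Standard inner product on R^3."""
    return sum(ui * wi for ui, wi in zip(u, w))

# v = sum_n a_n e_n
v = tuple(sum(a[n] * e[n][i] for n in range(3)) for i in range(3))

# Since (e_m, e_n) = delta_{mn}, the double sum collapses to sum_n |a_n|^2.
norm_sq = sum(a[n] ** 2 for n in range(3))

print(dot(v, v))           # 169.0
print(norm_sq)             # 169.0 -- same value: ||v||^2 = sum |a_n|^2
print(math.sqrt(norm_sq))  # 13.0  -- the length of v, Pythagoras-style
```

For a non-orthonormal basis the off-diagonal terms $(e_m, e_n)$ would survive and the shortcut would fail, which is the whole content of the theorem's hypothesis.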