Find a scalar product on a vector space for which a given basis is orthonormal


Let $X=\left( \begin{pmatrix}1 \\ 1 \\ 1 \\ 1\end{pmatrix}, \begin{pmatrix}1 \\ 1 \\ 1 \\ 0\end{pmatrix}, \begin{pmatrix}1 \\ 1 \\ 0 \\ 0\end{pmatrix}, \begin{pmatrix}1 \\ 0 \\ 0 \\ 0\end{pmatrix} \right)$ be an orthonormal basis of $\mathbb{R}^4$.

What does the scalar product $\langle \vec{x} \mid \vec{y} \rangle$ look like?

First of all, I thought of exploiting the known fact about orthonormal bases: $(\forall \vec{x},\vec{y} \in \mathbb{R}^4)\ \langle \vec{x} \mid \vec{y} \rangle = \sum_i \langle \vec{x} \mid \vec{x}_i \rangle \langle \vec{y} \mid \vec{x}_i \rangle$, where the $\vec{x}_i$ are the corresponding basis vectors.

Considering that orthonormal vectors are defined by $\langle \vec{x}_j \mid \vec{x}_k \rangle = \delta_{jk}$,

I'll get $\langle \vec{x} \mid \vec{y} \rangle = \sum_i x^\#_i(\vec{x})\, x^\#_i(\vec{y})$, where $x^\#_i(\vec{x}) = \langle \vec{x} \mid \vec{x}_i \rangle$ are the coordinate functionals.

However, after substituting an appropriate basis vector (for instance the first one) into $\langle \vec{x}_i \mid \vec{x}_i \rangle$, it does not equal $1$ as presumed.
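The failed check can be reproduced numerically. The sketch below (pure Python, no dependencies) evaluates $\langle \vec{x}_i \mid \vec{x}_j \rangle$ with the *standard* dot product, which is what substituting the components directly amounts to:

```python
# The vectors of X, written in standard coordinates.
X = [(1, 1, 1, 1), (1, 1, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0)]

def dot(u, v):
    """Standard dot product on R^4."""
    return sum(a * b for a, b in zip(u, v))

# Diagonal entries <x_i | x_i> under the standard dot product:
print([dot(x, x) for x in X])   # → [4, 3, 2, 1], not [1, 1, 1, 1]

# Off-diagonal entries are nonzero as well, e.g.:
print(dot(X[0], X[1]))          # → 3, not 0
```

So under the standard dot product these vectors are neither unit-length nor mutually orthogonal, which is exactly the discrepancy described above.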

What is wrong? Where did my reasoning fail?

Best answer

It seems as if you’ve gotten things a bit backwards. When you express the vectors in coordinates relative to $X$, then this scalar product can be expressed as the dot product $\sum_i v_iw_i$. Relative to this basis, though, the coordinates of the basis vectors are simply $(1,0,0,0)$, $(0,1,0,0)$, $(0,0,1,0)$ and $(0,0,0,1)$, and their dot products obviously have the expected values. If you compute the dot product of the coordinate vectors relative to the standard basis, on the other hand, you shouldn’t expect to get the right values, since that dot product represents a different scalar product. You can, however, use those values to compute this scalar product, as I’ll show below.
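To illustrate, here is a sketch in Python. For this particular basis, the coordinates of a vector $\vec v$ relative to $X$ can be found by back-substitution in $\vec v = c_1\vec{x}_1 + c_2\vec{x}_2 + c_3\vec{x}_3 + c_4\vec{x}_4$, which gives the closed form used in `coords` below (the function name is just for illustration):

```python
X = [(1, 1, 1, 1), (1, 1, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0)]

def coords(v):
    """Coordinates of v relative to X, by back-substitution:
    the last component forces c1 = v4, then c2 = v3 - v4, etc."""
    v1, v2, v3, v4 = v
    return (v4, v3 - v4, v2 - v3, v1 - v2)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Relative to X, each basis vector has coordinates e_i, so the dot
# products of the coordinate vectors give the Kronecker delta:
gram_in_X = [[dot(coords(u), coords(w)) for w in X] for u in X]
print(gram_in_X)   # → the 4x4 identity matrix
```

That is, computing the dot product *in $X$-coordinates* does recover $\langle \vec{x}_j \mid \vec{x}_k \rangle = \delta_{jk}$.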

In any given basis $\mathcal B$, an inner product can be represented by a positive-definite matrix $A$, i.e., $\langle\mathbf v,\mathbf w\rangle = [\mathbf v]_{\mathcal B}^TA[\mathbf w]_{\mathcal B}$. If we change basis via the matrix $B$, we have $$\langle\mathbf v,\mathbf w\rangle = (B^{-1}B[\mathbf v]_{\mathcal B})^TA(B^{-1}B[\mathbf w]_{\mathcal B}) = (B[\mathbf v]_{\mathcal B})^T (B^{-T}AB^{-1}) (B[\mathbf w]_{\mathcal B}),$$ i.e., the inner product matrix $A$ transforms as $B^{-T}AB^{-1}$. In the given orthonormal basis, $A=I$ and the matrix $B$ that has those vectors as its columns is the change-of-basis matrix to the standard basis $\mathcal E$, so you need to compute $B^{-T}B^{-1}$ to find the expression of this inner product in the standard basis.
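Carrying that computation out for the basis above gives the tridiagonal matrix $A$ below (a sketch; $A = B^{-T}B^{-1}$ was worked out by hand, and the code only verifies it by checking that $B^TAB = I$, i.e., that the basis vectors are orthonormal under $\langle\mathbf v,\mathbf w\rangle = \mathbf v^TA\mathbf w$):

```python
# B has the vectors of X as its columns.
B = [[1, 1, 1, 1],
     [1, 1, 1, 0],
     [1, 1, 0, 0],
     [1, 0, 0, 0]]

# Candidate inner-product matrix A = B^{-T} B^{-1} in the standard basis.
A = [[ 1, -1,  0,  0],
     [-1,  2, -1,  0],
     [ 0, -1,  2, -1],
     [ 0,  0, -1,  2]]

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transpose(P):
    return [list(row) for row in zip(*P)]

# B^T A B should be the identity: each basis vector has unit length and
# distinct basis vectors are orthogonal under <v|w> = v^T A w.
print(matmul(transpose(B), matmul(A, B)))   # → 4x4 identity matrix
```

So in standard coordinates the sought scalar product is $\langle \vec{x} \mid \vec{y} \rangle = \vec{x}^{\,T} A\, \vec{y}$ with this $A$.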

Observe that $B^{-T}B^{-1} = (BB^T)^{-1}$. The elements of $BB^T$ are pairwise dot products of the rows of $B$; since this particular $B$ is symmetric, these are also the pairwise dot products of the elements of $X$ (expressed relative to the standard basis), so $BB^T$ is the Gram matrix of $X$. A different way of describing the above result, then, is that the matrix of the inner product that makes this basis orthonormal is the inverse of its Gram matrix. You can think of the Gram matrix as encapsulating all of the “overlap” or “cross-talk” among the elements of $X$; inverting it sorts all of that out and, in a sense, separates the directions that those vectors represent.
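This Gram-matrix description can also be checked directly (again a sketch in pure Python; the matrix $A$ here is the candidate inner-product matrix $B^{-T}B^{-1}$ for this basis, written out by hand, and the code verifies that the Gram matrix times $A$ is the identity):

```python
X = [(1, 1, 1, 1), (1, 1, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Gram matrix of X: pairwise standard dot products ("overlap") of the
# basis vectors.
gram = [[dot(u, w) for w in X] for u in X]
print(gram)   # → [[4, 3, 2, 1], [3, 3, 2, 1], [2, 2, 2, 1], [1, 1, 1, 1]]

# Candidate inner-product matrix in the standard basis.
A = [[ 1, -1,  0,  0],
     [-1,  2, -1,  0],
     [ 0, -1,  2, -1],
     [ 0,  0, -1,  2]]

# gram @ A should be the identity, i.e. A = gram^{-1}:
prod = [[sum(gram[i][k] * A[k][j] for k in range(4)) for j in range(4)]
        for i in range(4)]
print(prod)   # → 4x4 identity matrix
```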