Let $U$ be the subspace of $\mathbb{R}^5$ spanned by $(1,2,3,-1,2)^T$ and $(1,0,-1,0,1)^T$. How do I find an orthonormal basis of $U$?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 4 solutions below.
The two vectors you provided are already orthogonal to each other; that is, their inner product is $0$.
You just need to divide each of them by its norm.
In general, check out the Gram-Schmidt process.
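A quick check of this claim in plain Python (no libraries; variable names are mine):

```python
from math import sqrt

a = [1, 2, 3, -1, 2]
b = [1, 0, -1, 0, 1]

# Inner product: 1*1 + 2*0 + 3*(-1) + (-1)*0 + 2*1 = 0
dot = sum(x * y for x, y in zip(a, b))

# Since a and b are already orthogonal, just divide each by its norm
norm = lambda v: sqrt(sum(x * x for x in v))
a_hat = [x / norm(a) for x in a]  # norm(a) = sqrt(19)
b_hat = [x / norm(b) for x in b]  # norm(b) = sqrt(3)
```

The resulting `a_hat`, `b_hat` are unit vectors spanning the same subspace $U$.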
Use the Gram-Schmidt procedure.
This procedure is very well known, so I'll just link you to a treatment of it that will help you: https://www.math.purdue.edu/academic/files/courses/2010spring/MA26200/4-12.pdf
Orthonormal means orthogonal and normalized.
Normalization can be done for each vector separately, so it can wait until the end, once orthogonality has been obtained.
Now orthogonality: we have two vectors $\vec a$ and $\vec b$ and need to find two orthogonal vectors that span the same space. These must be two independent linear combinations of $\vec a$ and $\vec b$, say $\alpha\vec a+\beta \vec b$ and $\gamma\vec a+\delta\vec b$.
Let us express orthogonality:
$$(\alpha\vec a+\beta \vec b)\cdot(\gamma\vec a+\delta\vec b)=\alpha\gamma\,\vec a^2+(\alpha\delta+\beta\gamma)\,\vec a\cdot\vec b+\beta\delta\,\vec b^2=0.$$
This is a single equation in four unknowns, so we have freedom to spare. We may choose three of them arbitrarily, and we set $\alpha=1,\beta=0,\delta=1$, so that both $\vec a$ and $\vec b$ are represented and the two linear combinations are guaranteed independent.
There remains
$$\vec a\cdot(\gamma\vec a+\vec b)=\gamma\,\vec a^2+\vec a\cdot\vec b=0,$$
giving
$$\gamma=-\frac{\vec a\cdot\vec b}{\vec a^2}.$$
The orthogonal vectors are
$$\vec a,\text{ and }\vec b-\frac{\vec a\cdot\vec b}{\vec a^2}\vec a.$$
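Plugging the question's two vectors into this formula (a plain-Python sketch; names are mine):

```python
a = [1, 2, 3, -1, 2]
b = [1, 0, -1, 0, 1]
dot = lambda u, v: sum(x * y for x, y in zip(u, v))

# gamma = -(a . b) / (a . a); here a . b = 0, so gamma = 0
gamma = -dot(a, b) / dot(a, a)

# The second orthogonal vector, b + gamma * a, is therefore just b
b_orth = [bi + gamma * ai for ai, bi in zip(a, b)]
```

Since $\vec a\cdot\vec b=0$ already, $\gamma=0$ and the formula returns $\vec b$ unchanged.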
Presumably, this is with respect to the "standard" inner product.
Perform a single Gram-Schmidt step. If $$a=\begin{bmatrix} 1\\2\\3\\-1\\2\end{bmatrix}\text{ and } b=\begin{bmatrix}1\\0\\-1\\0\\1\end{bmatrix}$$ then $$a^*=a\text{ and } b^*=b-\frac{a^*\cdot b}{a^*\cdot a^*}a^*$$ or $$a^*=\begin{bmatrix} 1\\2\\3\\-1\\2\end{bmatrix}\text{ and } b^*=\begin{bmatrix}1\\0\\-1\\0\\1\end{bmatrix}.$$ Ha! It turns out they were already orthogonal, since $a^* \cdot b = 0$. To make them orthonormal, divide each one by its length: $$a^{**}=\frac{1}{\sqrt{19}}\begin{bmatrix} 1\\2\\3\\-1\\2\end{bmatrix}\text{ and } b^{**}=\frac{1}{\sqrt{3}}\begin{bmatrix}1\\0\\-1\\0\\1\end{bmatrix}.$$
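A final sanity check on $a^{**}$ and $b^{**}$ in plain Python (the tolerance is my choice):

```python
from math import sqrt

a2 = [x / sqrt(19) for x in [1, 2, 3, -1, 2]]  # a** = a / sqrt(19)
b2 = [x / sqrt(3) for x in [1, 0, -1, 0, 1]]   # b** = b / sqrt(3)
dot = lambda u, v: sum(x * y for x, y in zip(u, v))

# Orthonormal: both have unit length and their inner product is 0
checks = (dot(a2, a2), dot(b2, b2), dot(a2, b2))
```

Up to floating-point rounding, `checks` comes out as $(1, 1, 0)$, confirming the basis is orthonormal.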