Extending $\{u_1, u_2\}$ to an orthonormal basis when finding an SVD


I've been working through my linear algebra textbook, and when finding an SVD there's just one thing I don't understand.

For example, consider finding an SVD for a $3\times 2$ matrix $A$. I will skip the steps of finding the eigenvectors, eigenvalues, and singular values; in any case, we find that

$$ V = \begin{bmatrix}\vec{v}_1 & \vec{v}_2\end{bmatrix} = \begin{bmatrix} 1/\sqrt2 & -1/\sqrt2\\ 1/\sqrt2 & 1/\sqrt2 \end{bmatrix} $$

and we know that

$$ \vec{u}_n = \frac{1}{\sigma_n}A\vec{v}_n $$

which gives

$$ \vec{u}_1 = \begin{bmatrix}2/\sqrt6\\1/\sqrt6\\1/\sqrt6\end{bmatrix}, \vec{u}_2 = \begin{bmatrix}0\\-1/\sqrt2\\1/\sqrt2\end{bmatrix} $$

but we know that $U$ is an $m \times m$ matrix, so it must be $3 \times 3$, and so we have to find $\vec{u}_3$. This is where I get stuck; the book says that one way to do this is to use the Gram-Schmidt process, but I can't seem to wrap my head around how to apply it to the vectors shown above.


There are a few ways to approach this problem.

Eyeball method

Strip away the distraction of normalization. Scaled to clear the square roots, the columns $\vec{u}_1$ and $\vec{u}_2$ become $$ \tilde{v}_{1} = % \left[ \begin{array}{c} 2 \\ 1 \\ 1 \end{array} \right], \qquad % \tilde{v}_{2} = % \left[ \begin{array}{r} 0 \\ -1 \\ 1 \end{array} \right]. % $$ Find a vector perpendicular to both. One such solution is $$ \tilde{v}_{3} = % \left[ \begin{array}{r} -1 \\ 1 \\ 1 \end{array} \right], $$ as you can verify: $\tilde{v}_{3} \cdot \tilde{v}_{1} = -2 + 1 + 1 = 0$ and $\tilde{v}_{3} \cdot \tilde{v}_{2} = 0 - 1 + 1 = 0$.
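As a quick numerical check (a NumPy sketch, not part of the original answer), the cross product of the two columns is perpendicular to both, and it comes out parallel to $(-1, 1, 1)$:

```python
import numpy as np

# Unnormalized u1 and u2 (the "tilde" vectors above)
u1 = np.array([2.0, 1.0, 1.0])
u2 = np.array([0.0, -1.0, 1.0])

# In R^3, the cross product is perpendicular to both inputs.
u3 = np.cross(u1, u2)

print(u3)                # [ 2. -2. -2.], i.e. -2 * (-1, 1, 1)
print(u3 @ u1, u3 @ u2)  # 0.0 0.0
```

Note the cross product only works in $\mathbb{R}^3$; the other two approaches below generalize to higher dimensions.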

Systematic approach

Collect the unnormalized vectors into the matrix $$ \mathbf{A} = \left[ \begin{array}{cr} 2 & 0 \\ 1 & -1 \\ 1 & 1 \\ \end{array} \right]. $$ The vector we want spans the left nullspace $\mathcal{N}\left(\mathbf{A}^{T} \right)$, the orthogonal complement of the column space. The row-reduced form is $$ \begin{align} % \mathbf{A}^{T} &\mapsto \mathbf{E}_{\mathbf{A}^{T}} \\ % \left[ \begin{array}{crc} 2 & 1 & 1 \\ 0 & -1 & 1 \\ \end{array} \right] % &\mapsto % \left[ \begin{array}{ccr} 1 & 0 & 1 \\ 0 & 1 & -1 \\ \end{array} \right] \end{align} $$ In terms of the free variable $x_{3}$, the basic variables are $$ \begin{align} x_{1} &= -x_{3}, \\ x_{2} &= x_{3}. \end{align} $$ Making the natural choice $x_{3}=1$ produces the column vector $$ % \left[ \begin{array}{r} x_{1} \\ x_{2} \\ x_{3} \end{array} \right] % = % \left[ \begin{array}{r} -1 \\ 1 \\ 1 \end{array} \right] $$
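The same left-nullspace computation can be sketched in NumPy (an illustration, not part of the original answer): a full SVD of $\mathbf{A}$ returns a $3 \times 3$ $U$, and since $\operatorname{rank}(\mathbf{A}) = 2$, the third column of $U$ spans $\mathcal{N}\left(\mathbf{A}^{T}\right)$:

```python
import numpy as np

# Matrix whose columns are the unnormalized u1 and u2
A = np.array([[2.0,  0.0],
              [1.0, -1.0],
              [1.0,  1.0]])

# full_matrices=True (the default) gives the full 3x3 U; the columns
# beyond rank(A) span the left nullspace N(A^T).
U, s, Vt = np.linalg.svd(A, full_matrices=True)
u3 = U[:, 2]

print(np.allclose(A.T @ u3, 0))  # True: u3 is in N(A^T)
print(np.abs(u3) * np.sqrt(3))   # [1. 1. 1.]: u3 is (-1,1,1)/sqrt(3) up to sign
```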

Gram-Schmidt

Make any choice for the third vector and use the Gram-Schmidt process to make it orthogonal to the first two. A wise choice to begin with is $$ \tilde{v}_{3} = \left[ \begin{array}{c} 0 \\ 0 \\ 1 \end{array} \right] $$ Why is this a wise choice? It is rich in $0$s, which makes the manipulations easy.

Define the projection of the vector $v$ onto the vector $u$ as $$ \operatorname{proj}_{u} v = \frac{v\cdot u}{u \cdot u}\, u. $$ The Gram-Schmidt process fixes $v_{3}$ by subtracting its projections onto the first two vectors: $$ v_{GS} = v_{3} - \frac{v_{3} \cdot v_{1}} {v_{1} \cdot v_{1}} v_{1} - \frac{v_{3} \cdot v_{2}} {v_{2} \cdot v_{2}} v_{2} $$ where $v_{1}$ and $v_{2}$ denote the unnormalized $\tilde{v}_{1}$ and $\tilde{v}_{2}$ above.

$$ \frac{1}{3} \left[ \begin{array}{r} -1 \\ 1 \\ 1 \end{array} \right] % = % \left[ \begin{array}{c} 0 \\ 0 \\ 1 \end{array} \right] % - % \frac{1}{6} \left[ \begin{array}{c} 2 \\ 1 \\ 1 \end{array} \right] % - % \frac{1}{2} \left[ \begin{array}{r} 0 \\ -1 \\ 1 \end{array} \right] % $$
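The arithmetic above can be reproduced in a few lines of NumPy (a sketch with a hypothetical helper `proj`, not the book's code):

```python
import numpy as np

def proj(v, u):
    """Projection of v onto u: (v.u / u.u) u."""
    return (v @ u) / (u @ u) * u

u1 = np.array([2.0, 1.0, 1.0])   # unnormalized u1
u2 = np.array([0.0, -1.0, 1.0])  # unnormalized u2
v3 = np.array([0.0, 0.0, 1.0])   # the "wise choice" starting vector

# Gram-Schmidt step: subtract the components along u1 and u2.
v_gs = v3 - proj(v3, u1) - proj(v3, u2)

print(v_gs)  # [-1/3, 1/3, 1/3]
```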

The normalized form is the column vector you want $$ \frac{1}{\sqrt{3}} \left[ \begin{array}{r} -1 \\ 1 \\ 1 \end{array} \right] $$
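As a final sanity check (again a NumPy sketch, not part of the original answer), assembling the three normalized columns into $U$ and verifying $U^{T}U = I$ confirms the basis is orthonormal:

```python
import numpy as np

u1 = np.array([2.0, 1.0, 1.0]) / np.sqrt(6)
u2 = np.array([0.0, -1.0, 1.0]) / np.sqrt(2)
u3 = np.array([-1.0, 1.0, 1.0]) / np.sqrt(3)

U = np.column_stack([u1, u2, u3])

print(np.allclose(U.T @ U, np.eye(3)))  # True: columns are orthonormal
```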