Find a singular value decomposition for
$$A = \begin{bmatrix}1&1&0\\1&0&1\end{bmatrix}.$$
I know that the steps for finding an SVD of a matrix $A$ such that $A = U\Sigma V^T$ are the following:
1) Find $A^TA$.
2) Find the eigenvalues of $A^TA$.
3) Find the eigenvectors of $A^TA$.
4) Set up $\Sigma$ by placing the singular values $\sigma_i = \sqrt{\lambda_i}$ (the square roots of the positive eigenvalues of $A^TA$) on the diagonal of a matrix with the same dimensions as $A$, with $0$ in all the other entries.
5) Normalize the eigenvectors of $A^TA$ to get the matrix $V$ (which will need to be transposed).
6) Find the columns of $U$ from the normalized eigenvectors via $u_i = \frac{1}{\sigma_i}Av_i$, where $v_i$ are the columns of $V$. (A numerical sketch of these steps follows below.)
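As a sanity check on the steps above, here is a rough numerical sketch using numpy (the variable names are my own; for this particular $A$ the first two singular values are nonzero, which is what Step 6 relies on):

```python
import numpy as np

# The matrix from the problem.
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# Step 1: form A^T A.
AtA = A.T @ A

# Steps 2-3: eigenvalues/eigenvectors of the symmetric matrix A^T A.
# eigh returns them in ascending order, so flip to descending.
eigvals, eigvecs = np.linalg.eigh(AtA)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Step 4: Sigma has the same shape as A, with the singular values
# sigma_i = sqrt(lambda_i) on its diagonal (clip guards against
# tiny negative round-off in the zero eigenvalue).
sigmas = np.sqrt(np.clip(eigvals, 0.0, None))
Sigma = np.zeros_like(A)
np.fill_diagonal(Sigma, sigmas)

# Step 5: eigh already returns orthonormal eigenvectors, so they
# are the columns of V as-is.
V = eigvecs

# Step 6: u_i = (1 / sigma_i) * A v_i; for this A, the first two
# sigma_i are nonzero, so these two columns fill out U.
U = np.column_stack([A @ V[:, i] / sigmas[i]
                     for i in range(A.shape[0])])

# Sanity check: A should factor as U Sigma V^T.
print(np.allclose(A, U @ Sigma @ V.T))  # True
```

Here `eigh` is used rather than `eig` because $A^TA$ is symmetric, so it returns real eigenvalues and orthonormal eigenvectors directly.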
However, after (1), I get $$\begin{bmatrix}2&1&1\\1&1&0\\1&0&1\end{bmatrix}.$$ Then, expanding along Row $3$ to find the eigenvalues, I get $(-1-\lambda) + (1-\lambda)\left[(2-\lambda)(1-\lambda)-1\right]$. This simplifies to $(-1+\lambda)-(\lambda^3-4\lambda^2+4\lambda+1)$.
Is this the correct way to go about this? This is for an Elementary Linear Algebra course, yet just finding the eigenvalues and eigenvectors is quite drawn out.
Any suggestions or tips would be helpful.
I think there is a mistake in your calculation. Notice that:
$$ \det(A^\top A - \lambda I) = \det\begin{pmatrix} 2 - \lambda & 1 & \fbox{$1$} \\ 1 & 1 - \lambda & 0 \\ 1 & 0 & \fbox{$1 - \lambda$} \end{pmatrix} $$ Expanding along the third column (the boxed entries are its nonzero ones): $$ \begin{aligned} \det(A^\top A - \lambda I) &= 1 \cdot \bigl(0 - (1 - \lambda)\bigr) + (1 - \lambda)\left[ (2 - \lambda)(1 - \lambda) - 1 \right] \\ &= (1 - \lambda)\left[ (2 - \lambda)(1 - \lambda) - 2 \right] \\ &= (1 - \lambda)(\lambda^2 - 3 \lambda) \\ &= \lambda (1 - \lambda)(\lambda - 3). \end{aligned} $$ Therefore, you get $\lambda_1 = 0$, $\lambda_2 = 1$, $\lambda_3 = 3$.
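To connect this back to the SVD: the singular values are the square roots of the nonzero eigenvalues, taken in decreasing order, so here $$ \sigma_1 = \sqrt{3}, \qquad \sigma_2 = 1, \qquad \Sigma = \begin{bmatrix} \sqrt{3} & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}. $$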