I have an example:
$$A=\begin{pmatrix} 2 & 2 & 4 \\ 2 & 5 & 8 \\ 4 & 8 & 17 \end{pmatrix}$$
The eigenvalues I found are $\lambda_1=\lambda_2=1$ and $\lambda_3=22$.
For $\lambda=1$,
$$\begin{pmatrix} x\\ y \\ z \end{pmatrix}=\begin{pmatrix} -2\\ 1 \\ 0 \end{pmatrix}y+\begin{pmatrix} -4\\ 0 \\ 1 \end{pmatrix}z$$
For $\lambda=22$,
$$\begin{pmatrix} x\\ y \\ z \end{pmatrix}=\begin{pmatrix} 1/4\\ 1/2 \\ 1 \end{pmatrix}z$$
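As a quick sanity check (a pure-Python sketch, not part of the original computation), one can verify $Av=\lambda v$ for each of these eigenpairs:

```python
# Verify A v = lambda v for the eigenpairs above (pure Python, no libraries).
A = [[2, 2, 4],
     [2, 5, 8],
     [4, 8, 17]]

def matvec(M, v):
    """Multiply a 3x3 matrix by a length-3 vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# (eigenvalue, eigenvector) pairs: the two basis vectors for lambda = 1,
# and the eigenvector (1/4, 1/2, 1) for lambda = 22.
pairs = [(1, [-2, 1, 0]), (1, [-4, 0, 1]), (22, [0.25, 0.5, 1])]
for lam, v in pairs:
    Av = matvec(A, v)
    assert all(abs(Av[i] - lam * v[i]) < 1e-12 for i in range(3))
```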
However, the eigenvectors I found are not all orthogonal to each other. The goal is to find an orthogonal matrix $P$ and a diagonal matrix $Q$ so that $A=PQP^T$.
How can I find orthogonal eigenvectors when some of the eigenvalues are the same?
4.3k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 solutions below.
We know that the eigenvectors corresponding to different eigenvalues of a symmetric matrix are orthogonal. You have two distinct eigenvalues, hence you can pick two orthogonal unit eigenvectors: $v_1$ for $\lambda=1$ and $v_2$ for $\lambda=22$. Since your matrix is $3\times 3$, the third vector to form $P=[v_1 | v_2 | v_3]$ can be taken as $v_3=\pm v_1\times v_2$: it is a unit vector orthogonal to both, and since it is orthogonal to $v_2$ it lies in the $2$-dimensional eigenspace for $\lambda=1$. It is then easy to see that $PP^T=I$.
Now just take $Q=\mathrm{diag}(1,22,1)$, listing the eigenvalues in the same order as the corresponding columns of $P$, and you're done: $A=PQP^T$.
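This construction can be sketched in pure Python (the helper names `unit`, `cross`, `matmul` are mine): take unit eigenvectors for the two distinct eigenvalues, complete the basis with a cross product, and check $A=PQP^T$.

```python
import math

A = [[2, 2, 4], [2, 5, 8], [4, 8, 17]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def unit(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

v1 = unit([-2, 1, 0])   # unit eigenvector for lambda = 1
v2 = unit([1, 2, 4])    # unit eigenvector for lambda = 22 (scaled (1/4, 1/2, 1))
v3 = cross(v1, v2)      # unit vector orthogonal to both; lies in the lambda = 1 eigenspace

# Columns of P are v1, v2, v3; Q lists the eigenvalues in the same column order.
P = [[v1[i], v2[i], v3[i]] for i in range(3)]
PT = [list(row) for row in zip(*P)]
Q = [[1, 0, 0], [0, 22, 0], [0, 0, 1]]

PQPT = matmul(matmul(P, Q), PT)
assert all(abs(PQPT[i][j] - A[i][j]) < 1e-9 for i in range(3) for j in range(3))
```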
How about Gram-Schmidt? Since the eigenspace for $\lambda=1$ is $2$-dimensional, it certainly contains two orthonormal eigenvectors.
Project and subtract: $(-4,0,1)-\frac85(-2,1,0)= (-\frac45,-\frac85,1)$.
Now normalize: $(-\frac45,-\frac85,1)$ has norm $\frac{\sqrt{105}}5$, so set $b_1:=\frac1{\sqrt{105}}(-4,-8,5)$. And $b_2:=(-\frac2{\sqrt5},\frac1{\sqrt5},0)$.
Finally, normalize the eigenvector for $\lambda =22$: scale $(\frac14,\frac12,1)$ to $(1,2,4)$, which has norm $\sqrt{21}$, giving $b_3:=\frac1{\sqrt{21}}(1,2,4)$. Conveniently, this one is orthogonal to the others by symmetry of the matrix.
(Alternatively, the cross-product would have been a good way to do this as well.)
Then the matrix $P$ whose columns are the basis vectors $b_1,b_2,b_3$ above will do the trick: $P^TAP=\begin{pmatrix}1&0&0\\0&1&0\\0&0&22\end{pmatrix}$.
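The claim can be checked numerically (a pure-Python sketch; the variable names mirror $b_1,b_2,b_3$ above):

```python
import math

A = [[2, 2, 4], [2, 5, 8], [4, 8, 17]]
s105, s5, s21 = math.sqrt(105), math.sqrt(5), math.sqrt(21)

b1 = [-4/s105, -8/s105, 5/s105]  # Gram-Schmidt output, normalized
b2 = [-2/s5, 1/s5, 0.0]          # normalized (-2, 1, 0)
b3 = [1/s21, 2/s21, 4/s21]       # normalized eigenvector for lambda = 22

P = [[b1[i], b2[i], b3[i]] for i in range(3)]  # columns b1, b2, b3
PT = [list(row) for row in zip(*P)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# P^T A P should be the diagonal matrix diag(1, 1, 22).
D = matmul(matmul(PT, A), P)
expected = [[1, 0, 0], [0, 1, 0], [0, 0, 22]]
assert all(abs(D[i][j] - expected[i][j]) < 1e-9 for i in range(3) for j in range(3))
```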
One thing we know is that eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal. So, if the eigenvalues were distinct, say $\lambda_1< \lambda_2< \lambda_3$, any choice of eigenvectors $v_1,v_2,v_3$ would already be orthogonal and we would be done. Here, however, we have eigenvalues $\lambda_1=\lambda_2=1$ and $\lambda_3=22$, so there are not $3$ distinct eigenvalues and the situation becomes somewhat more complicated.
Suppose we found $v_1,v_2\in E(A,\lambda_1)$ which are linearly independent (and hence a basis for the eigenspace). We know that $v_1\perp v_3$ and $v_2\perp v_3$, i.e. $\langle v_1,v_3\rangle=\langle v_2,v_3\rangle=0$. By bilinearity of the inner product, we get that $\langle av_1+bv_2,v_3\rangle =0$ for all $a,b\in \mathbb{R}$. The upshot is that the entire eigenspace $E(A,\lambda_1)$ is orthogonal to $v_3$. So, we are free to choose any basis of eigenvectors for $E(A,\lambda_1)$ and proceed from there.

Well, just apply Gram-Schmidt to $v_1,v_2$. Define
$$ u_1=\frac{v_1}{\lVert v_1\rVert}, \qquad u_2=\frac{v_2-\langle v_2, u_1\rangle u_1}{\lVert v_2-\langle v_2, u_1\rangle u_1\rVert}.$$
A quick check shows that these two vectors form an orthonormal basis for $E(A,\lambda_1)$. Then, if we take any nonzero $v_3\in E(A,\lambda_3)$ and set
$$ u_3=\frac{v_3}{\lVert v_3\rVert},$$
we can see that $(u_1,u_2,u_3)$ is an orthonormal eigenbasis of $\mathbb{R}^3\cong E(A,\lambda_1)\oplus E(A,\lambda_3)$ with respect to $A$.

You've already found the vectors $v_1,v_2,v_3$. Once you compute $u_1,u_2,u_3$, the matrix $P=[u_1,u_2,u_3]$ (with the $u_i$ as columns) is orthogonal and
$$ A=P \begin{bmatrix} 1&0&0\\ 0&1&0\\ 0&0&22 \end{bmatrix} P^T. $$
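The $u_1,u_2,u_3$ construction above can be sketched in pure Python (the function name `orthonormalize` is mine, not from the answer):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def orthonormalize(vectors):
    """Gram-Schmidt: turn linearly independent vectors into an orthonormal list."""
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = dot(w, u)  # projection coefficient onto u (u is already unit-length)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        n = math.sqrt(dot(w, w))
        basis.append([x / n for x in w])
    return basis

# v1, v2 span E(A, 1); v3 spans E(A, 22).
u1, u2 = orthonormalize([[-2, 1, 0], [-4, 0, 1]])
u3 = orthonormalize([[0.25, 0.5, 1]])[0]

# Pairwise orthogonality and unit length, as claimed.
for x in (u1, u2, u3):
    assert abs(dot(x, x) - 1) < 1e-12
for x, y in ((u1, u2), (u1, u3), (u2, u3)):
    assert abs(dot(x, y)) < 1e-12
```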