Diagonalization of a symmetric matrix


I'm trying to prove that every symmetric matrix is diagonalizable. I know that there are already many answers regarding this question, but I just want to check whether my approach is correct. After some searching, I found some clues to my problem (Here). Based on Tunococ's answer, I want to prove the following: for every symmetric matrix $A$ (with real entries), there exists an orthogonal matrix $B$ and a diagonal matrix $D$ such that $$ B^{T}AB = D. $$

Let's use induction on the order of $A$. For a $1 \times 1$ matrix (which is symmetric), this is trivial. Suppose that the result holds for all $k \times k$ symmetric matrices ($k\geq1$). Let $A$ be a $(k+1) \times (k+1)$ symmetric matrix, let $\lambda$ be any eigenvalue of $A$, and let $v$ be an associated eigenvector. Since an eigenvector multiplied by any nonzero scalar is still an eigenvector, we can assume that $\left \| v \right \|=1$. By the Gram-Schmidt process, we can obtain an orthonormal basis $\{v, w_2, \ldots, w_{k+1}\}$ of $\mathbb{R}^{k+1}$. Let $C$ be the orthogonal matrix defined by $$ C = \begin{bmatrix} | & | & \ldots & |\\ v & w_2 & \ldots & w_{k+1} \\ | & | & \ldots & | \end{bmatrix} $$
Then, after some computation, we obtain $$C^{T}AC =\begin{bmatrix} \lambda & O_{1 \times k} \\ O_{k \times 1} & F \end{bmatrix}.$$ Here, $F$ is a $k \times k$ symmetric matrix. By the induction hypothesis, there exists an orthogonal matrix $G$ and a diagonal matrix $H$ such that $G^{T}FG=H$. Let $$J =\begin{bmatrix} 1 & O_{1 \times k} \\ O_{k \times 1} & G \end{bmatrix}.$$ Since $G$ is an orthogonal matrix, $J$ is also an orthogonal matrix. By computation, we obtain $$(CJ)^{T}A(CJ)=\begin{bmatrix} \lambda & O_{1 \times k} \\ O_{k \times 1} & H \end{bmatrix}.$$ Note that since $C$ and $J$ are orthogonal matrices, their product $CJ$ is also an orthogonal matrix.
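The "some computation" can be spelled out using only facts already in the argument. Writing $e_1$ for the first standard basis vector of $\mathbb{R}^{k+1}$:

```latex
% First column: $Ce_1 = v$, and the rows of $C^{T}$ are
% $v^{T}, w_2^{T}, \ldots, w_{k+1}^{T}$, so orthonormality gives
% $C^{T}v = e_1$.  Hence
\begin{align*}
(C^{T}AC)\,e_1 = C^{T}Av = \lambda\, C^{T}v = \lambda\, e_1 .
\end{align*}
% The first row is zero beyond the $(1,1)$ entry because the whole
% matrix is symmetric:
\begin{align*}
(C^{T}AC)^{T} = C^{T}A^{T}C = C^{T}AC .
\end{align*}
% The same identity shows that the lower-right $k \times k$ block $F$
% satisfies $F^{T} = F$, which is exactly what the induction
% hypothesis requires.
```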

Is this argument correct?
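As a quick numerical sanity check of the statement being proved (a sketch, not part of the proof), the factorization $B^{T}AB = D$ can be verified with NumPy's `eigh`, which is specialized for symmetric matrices:

```python
import numpy as np

# A small symmetric matrix with real entries.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh handles symmetric (Hermitian) matrices: it returns real
# eigenvalues and an orthogonal matrix whose columns are eigenvectors.
eigenvalues, B = np.linalg.eigh(A)

# B is orthogonal: B^T B = I.
assert np.allclose(B.T @ B, np.eye(3))

# B^T A B is the diagonal matrix of eigenvalues.
D = B.T @ A @ B
assert np.allclose(D, np.diag(eigenvalues))
print(np.round(D, 6))
```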


1 Answer


No, you still need to explain why you can find an eigenpair $(\lambda, v)$ with $\lambda$ real and $v \in \mathbb{R}^{k+1}$.
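A sketch of that missing step (the standard argument, not taken from the post): let $\lambda \in \mathbb{C}$ be any root of the characteristic polynomial of $A$, with eigenvector $v \in \mathbb{C}^{n}$, $v \neq 0$. Then, using $A^{T} = A$ and that $A$ is real,

```latex
\begin{align*}
\bar{\lambda}\, v^{*}v
  = (\lambda v)^{*} v
  = (Av)^{*} v
  = v^{*} A^{T} v
  = v^{*} A v
  = v^{*} (\lambda v)
  = \lambda\, v^{*} v .
\end{align*}
% Since $v \neq 0$ we have $v^{*}v > 0$, hence
% $\bar{\lambda} = \lambda$, i.e.\ $\lambda$ is real.
% A real eigenvector then exists because $A - \lambda I$ is a
% real singular matrix, so its kernel contains a nonzero real vector.
```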