Symmetric matrix congruency


There is a theorem that says that every symmetric matrix is congruent to a diagonal matrix.

I've been trying to find the congruent matrix and the transition matrix for the following: $$ \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} $$

The method I learned is to perform row operations on both the matrix and the identity matrix, and the corresponding column operations on the original matrix only, until the original matrix becomes diagonal and the identity matrix becomes the transition matrix.

This process seems to loop infinitely with this matrix when I switch rows 2 and 3.

So I need a fail-proof method to find the congruent diagonal matrix and the transition matrix for the given matrix.


There are 3 answers below.

BEST ANSWER

Rather than switching the rows, add the third row to the second (with the matching column operation), and work from there.

I am not familiar with the algorithm you describe, and so I can't tell you why switching the rows would cause it to fail. However, you should find that $$ P = \pmatrix{1&0&0&0\\ 0&1&1&0\\ 0&1&-1&0\\ 0&0&0&1} $$ will work for your purposes (incidentally, note that $P = P^T$).

In terms of row-column operations (each listed row operation is followed by the corresponding column operation), here's how it would go: $$ \pmatrix{1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1} \quad \text{...II = II + III}\\ \pmatrix{ 1&0&0&0\\ 0&2&1&0\\ 0&1&0&0\\ 0&0&0&1} \quad \text{...III = 2 III - II}\\ \pmatrix{ 1&0&0&0\\ 0&2&0&0\\ 0&0&-2&0\\ 0&0&0&1} $$ The result is the congruent diagonal matrix $D = \operatorname{diag}(1, 2, -2, 1)$.
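As a quick numerical sanity check (my addition, using numpy; not part of the original answer), one can verify that this $P$ indeed diagonalizes the matrix by congruence:

```python
import numpy as np

# The symmetric matrix from the question.
A = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]])

# The transition matrix proposed above (note P = P^T).
P = np.array([[1, 0,  0, 0],
              [0, 1,  1, 0],
              [0, 1, -1, 0],
              [0, 0,  0, 1]])

# Congruence transform: should give the diagonal matrix diag(1, 2, -2, 1).
D = P.T @ A @ P
print(D)
```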

ANSWER

A fool-proof algorithm :)

1) Find the eigenvalues. The eigenvalues are the roots of the polynomial $\det(A-\lambda I)$. These are also the values that you will see on the diagonal.

2) Find the eigenvectors, and insert them as columns of a matrix named $P$.

Edit: in your case, since the matrix is symmetric, not only will you find a basis of eigenvectors, you will find an orthonormal basis of eigenvectors.

3) Check that $P^{-1}AP = D$, where $D$ is the diagonal matrix with the eigenvalues on the diagonal. Since the columns of $P$ are orthonormal, $P^{-1} = P^T$, so this similarity is also a congruence: $P^T A P = D$.
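A sketch of these three steps with numpy (my addition, not part of the answer): `np.linalg.eigh` is specialized to symmetric matrices and returns the eigenvalues together with an orthonormal basis of eigenvectors, so the similarity and the congruence coincide.

```python
import numpy as np

A = np.array([[1., 0., 0., 0.],
              [0., 0., 1., 0.],
              [0., 1., 0., 0.],
              [0., 0., 0., 1.]])

# Steps 1 and 2: eigenvalues and an orthonormal basis of eigenvectors
# (eigh returns the eigenvalues in ascending order).
eigenvalues, P = np.linalg.eigh(A)

# Step 3: since P is orthogonal, P^{-1} = P^T, so the similarity
# P^{-1} A P equals the congruence P^T A P.
D = P.T @ A @ P
print(np.round(D, 10))  # diagonal matrix with the eigenvalues -1, 1, 1, 1
```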

ANSWER

Although I'm quite late, I think I've got the solution according to your algorithm.

As far as I know, the algorithm is known as "symmetric Gaussian elimination". It is basically the usual Gaussian elimination for finding the inverse of a matrix, where you record the row operations in the identity matrix on the right; but after every row operation you perform the exact same operation on the columns, without recording it, and you stop as soon as the left-hand matrix is diagonal.
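Here is a small sketch of that procedure in Python/numpy (my own illustration; the function name is hypothetical). The fix for a zero pivot is to *add* a later row and column rather than swap, which is exactly the trick from the accepted answer:

```python
import numpy as np

def symmetric_gaussian_elimination(A):
    """Return (D, E) with D diagonal and E @ A @ E.T == D.

    Every row operation on A is mirrored by the same column operation;
    only the row operations are recorded in E.
    """
    A = A.astype(float).copy()
    n = A.shape[0]
    E = np.eye(n)
    for i in range(n):
        if A[i, i] == 0:
            # Zero pivot: add a later row (and column) with a nonzero
            # entry in this column -- swapping would loop forever here.
            for j in range(i + 1, n):
                if A[j, i] != 0:
                    A[i, :] += A[j, :]
                    E[i, :] += E[j, :]
                    A[:, i] += A[:, j]
                    break
        if A[i, i] == 0:
            continue  # column is already clear below the diagonal
        for j in range(i + 1, n):
            f = A[j, i] / A[i, i]
            A[j, :] -= f * A[i, :]
            E[j, :] -= f * E[i, :]
            A[:, j] -= f * A[:, i]
    return A, E

A = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]])
D, E = symmetric_gaussian_elimination(A)
print(np.diag(D))  # diag(1, 2, -0.5, 1)
```

Note that this yields a different diagonal, $\operatorname{diag}(1, 2, -\tfrac{1}{2}, 1)$, than the hand computation below: congruent diagonal forms are not unique, only the signs of the diagonal entries are (Sylvester's law of inertia).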

So here you go (row operations labelled with Roman numerals, column operations with Arabic numerals):

$$ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&0&1&0 & 0&1&0&0\\ 0&1&0&0 & 0&0&1&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad (\;A\;\;|\;\;\mathbb{1}\;) $$

$$ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&1&1&0 & 0&1&1&0\\ 0&1&0&0 & 0&0&1&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad \text{II = II+III}\\ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&2&1&0 & 0&1&1&0\\ 0&1&0&0 & 0&0&1&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad \text{2 = 2 + 3}\\ $$

$$ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&2&1&0 & 0&1&1&0\\ 0&3&1&0 & 0&1&2&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad \text{III = III+II}\\ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&2&3&0 & 0&1&1&0\\ 0&3&4&0 & 0&1&2&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad \text{3 = 3 + 2}\\ $$

$$ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&2&3&0 & 0&1&1&0\\ 0&0&-1&0 & 0&-1&1&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad \text{III = 2*III - 3*II}\\ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&2&0&0 & 0&1&1&0\\ 0&0&-2&0 & 0&-1&1&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad \text{3 = 2*3 - 3*2}\\ $$

$$ \left[ \begin{array}{cccc|cccc} 1&0&0&0 & 1&0&0&0\\ 0&2&0&0 & 0&1&1&0\\ 0&0&-2&0 & 0&-1&1&0\\ 0&0&0&1 & 0&0&0&1\\ \end{array} \right] \quad (\;D\;\;|\;\;P\;) $$

Here $P$ records the row operations, so $P A P^T = D$; equivalently, $Q = P^T$ satisfies $Q^T A Q = D$.
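A quick numerical check of the final blocks (my addition, using numpy):

```python
import numpy as np

# The symmetric matrix from the question.
A = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]])

# Final right-hand block: the recorded row operations.
P = np.array([[1,  0, 0, 0],
              [0,  1, 1, 0],
              [0, -1, 1, 0],
              [0,  0, 0, 1]])

# Final left-hand block.
D = np.diag([1, 2, -2, 1])

print(np.allclose(P @ A @ P.T, D))  # True
```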

Please let me know if I made any errors in the calculation.