Find $P$ such that $P^TAP = D$ where $D$ is a diagonal matrix.


Let $$A = \left(\begin{array}{cc} 2&3 \\ 3&4 \end{array}\right) \in M_2(\mathbb{C})$$

Find $P$ such that $P^TAP = D$ where $D$ is a diagonal matrix.

How can I find $P$? I tried Gaussian elimination, but it does not work:$$A = \left(\begin{array}{cc|cc} 2&3&1&0\\ 3&4&0&1 \end{array}\right) \sim \left(\begin{array}{cc|cc} 2&0&-8&6\\ 0&-1/2&-3/2&1 \end{array}\right)$$

What am I doing wrong? Steps would be much appreciated.

There are 5 answers below.

BEST ANSWER

You need to perform simultaneous row and column operations on the left-hand side while performing only column operations on the right-hand side. Then, when the left side becomes diagonal, the right side will be your $P$. In your case,

$$ \left(\begin{array}{cc|cc} 2&3&1&0\\ 3&4&0&1 \end{array}\right) \xrightarrow[C_2 = C_2 - \frac{3}{2}C_1]{R_2 = R_2 - \frac{3}{2}R_1} \left(\begin{array}{cc|cc} 2&0&1& -\frac{3}{2}\\ 0& -\frac{1}{2}&0&1 \end{array}\right) $$

and indeed

$$ \begin{pmatrix} 1 & 0 \\ -\frac{3}{2} & 1 \end{pmatrix} \begin{pmatrix} 2 & 3 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 1 & -\frac{3}{2} \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 0 & -\frac{1}{2} \end{pmatrix}. $$
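As a quick numerical sanity check (not part of the original answer), the congruence $P^TAP = D$ can be verified with NumPy:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, 4.0]])

# P records the column operation C2 = C2 - (3/2) C1 applied to the identity
P = np.array([[1.0, -1.5],
              [0.0,  1.0]])

# Congruence transform: should give diag(2, -1/2)
D = P.T @ A @ P
print(np.round(D, 10))
```

Note that this is a congruence ($P^TAP$), not a similarity ($P^{-1}AP$): the diagonal entries $2$ and $-\frac{1}{2}$ are not the eigenvalues of $A$, though their signs match the signature.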

ANSWER

I would love to hear what book you have that does this.

Meanwhile, take

$$ P = \left( \begin{array}{rr} 1 & - \frac{3}{2}\\ 0 & 1 \end{array} \right) $$

I asked for references here http://math.stackexchange.com/questions/1388421/reference-for-linear-algebra-books-that-teach-reverse-hermite-method-for-symmetr

I prefer to write this as a sequence of steps (if more than one is needed) which gives $P = P_1 P_2 \cdots P_r.$ Other people seemed to like the side-by-side grid that you display, so I guess there are advantages to both.

ANSWER

The characteristic polynomial of $A$ is $x^2-6x-1$, whose roots are $3\pm\sqrt{10}$. An example of an eigenvector with norm $1$ whose eigenvalue is $3+\sqrt{10}$ is $\frac1{\sqrt{20-2\sqrt{10}}}\bigl(\sqrt{10}-1,3\bigr)$ and an example of an eigenvector with norm $1$ whose eigenvalue is $3-\sqrt{10}$ is $\frac1{\sqrt{20-2\sqrt{10}}}\bigl(3,1-\sqrt{10}\bigr)$. So, take$$P=\frac1{\sqrt{20-2\sqrt{10}}}\begin{pmatrix}\sqrt{10}-1&3\\3&1-\sqrt{10}\end{pmatrix}$$and then $P^TAP=\left(\begin{smallmatrix}3+\sqrt{10}&0\\0&3-\sqrt{10}\end{smallmatrix}\right)$. Since $P$ is orthogonal, $P^T = P^{-1}$, so this is simultaneously a congruence and a similarity.

ANSWER

I think Jose comes closest to solving this problem in the way that I would, but his answer doesn't expand too much on the why of it all, so I'm writing up a separate answer.

If there are such $D$ and $P$ with $P$ orthogonal and $A = PDP^T$ (equivalently, $P^TAP = D$), then we say that $A$ is orthogonally diagonalizable. Since $A$ is real and symmetric, the spectral theorem guarantees this works: take the columns of $P$ to be normalized eigenvectors of $A$, which makes $P$ an orthogonal matrix (so $P^T = P^{-1}$), and set $D := \operatorname{diag}(\text{eigenvalues of } A)$. So indeed, solving this problem reduces to finding the eigenvalues and eigenvectors of $A$. There are many ways to derive the eigensystem of a matrix, but below is my solution.


For an eigenvalue $\lambda$, we must have $\det(\lambda I - A) = 0 \implies \lambda^2 - 6 \lambda - 1 = 0$. The two solutions are $\lambda = 3 \pm \sqrt{10}$.
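The roots of the characteristic polynomial can be double-checked numerically; a minimal sketch using NumPy's companion-matrix root finder:

```python
import numpy as np

# Coefficients of lambda^2 - 6*lambda - 1, highest degree first.
# .real discards any negligible imaginary round-off from the solver.
roots = np.sort(np.roots([1.0, -6.0, -1.0]).real)
print(roots)  # approximately [3 - sqrt(10), 3 + sqrt(10)]
```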

For $\lambda_1 = 3 + \sqrt{10}$, we have:

$$\begin{aligned} \mathbf{0} = (\lambda I-A)v &= \begin{pmatrix} 1+\sqrt{10} & -3\\ -3 & -1+\sqrt{10}\end{pmatrix}\begin{pmatrix}v_1\\ v_2 \end{pmatrix}\\ \implies v &= \begin{pmatrix} c\\ (1+\sqrt{10}) c/3\end{pmatrix} \end{aligned}$$

For $\lambda_2 = 3 - \sqrt{10}$, we have:

$$\begin{aligned} \mathbf{0} = (\lambda I-A)v &= \begin{pmatrix} 1-\sqrt{10} & -3\\ -3 & -1-\sqrt{10}\end{pmatrix}\begin{pmatrix}v_1\\ v_2 \end{pmatrix}\\ \implies v &= \begin{pmatrix} c \\ (1-\sqrt{10}) c/3\end{pmatrix} \end{aligned}$$

And while this is incredibly painful to normalize by hand, our matrices end up being:

$$P = \cfrac{1}{\sqrt{20-2\sqrt{10}}}\begin{pmatrix}\sqrt{10} - 1 & 3\\ 3 & 1-\sqrt{10} \end{pmatrix},\qquad D = \begin{pmatrix}3+\sqrt{10} & 0\\ 0 & 3-\sqrt{10} \end{pmatrix}$$
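Rather than normalizing by hand, one can let a library produce the orthonormal eigenvector matrix directly. A sketch (an alternative to the hand computation above, not part of it) using `numpy.linalg.eigh`, which is designed for symmetric matrices; note it returns eigenvalues in ascending order, so $3-\sqrt{10}$ comes first here:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, 4.0]])

# eigh: eigenvalues ascending, orthonormal eigenvectors as columns of P
eigvals, P = np.linalg.eigh(A)
print(eigvals)                    # approximately [3 - sqrt(10), 3 + sqrt(10)]
print(np.round(P.T @ A @ P, 10))  # diagonal matrix of the eigenvalues
```

Since $P$ is orthogonal, $P^T A P$ and $P^{-1} A P$ agree, so this verifies the diagonalization in one step.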

ANSWER

Problem

Diagonalize the matrix $$ \mathbf{A} = \left[ \begin{array}{cc} 2 & 3 \\ 3 & 4 \\ \end{array} \right] $$


Solution

Compute eigenvalues

The eigenvalues are the roots of the characteristic polynomial $$ p(\lambda) = \lambda^{2} - \lambda \text{ trace }\mathbf{A} + \det \mathbf{A} $$ The trace and determinant are $$ \text{ trace }\mathbf{A} = 6, \qquad \det \mathbf{A} = -1 $$ Therefore $$ p(\lambda) = \lambda^{2} - \lambda \text{ trace }\mathbf{A} + \det \mathbf{A} = \lambda^{2} - 6 \lambda - 1 $$ The roots are the eigenvalue spectrum $$ \lambda \left( \mathbf{A} \right) = 3 \pm \sqrt{10} $$

Result: $$ \mathbf{D} = \left[ \begin{array}{cc} 3+\sqrt{10} & 0 \\ 0 & 3-\sqrt{10} \\ \end{array} \right] $$

Eigenvectors

First $$ \begin{align} \left(\mathbf{A} - \lambda_{1} \mathbf{I}_{2} \right) w_{1} &= \mathbf{0} \\ % \left[ \begin{array}{cc} -1-\sqrt{10} & 3 \\ 3 & 1-\sqrt{10} \\ \end{array} \right] % \left[ \begin{array}{c} w_{x} \\ w_{y} \\ \end{array} \right] % &= % \left[ \begin{array}{c} 0 \\ 0 \\ \end{array} \right] % \end{align} $$

Solution $$ w_{1} = \left[ \begin{array}{c} \frac{1}{3} \left(-1+\sqrt{10}\right) \\ 1 \\ \end{array} \right] $$

Second $$ \begin{align} \left(\mathbf{A} - \lambda_{2} \mathbf{I}_{2} \right) w_{2} &= \mathbf{0} \\ % \left[ \begin{array}{cc} -1+\sqrt{10} & 3 \\ 3 & 1+\sqrt{10} \\ \end{array} \right] % \left[ \begin{array}{c} w_{x} \\ w_{y} \\ \end{array} \right] % &= % \left[ \begin{array}{c} 0 \\ 0 \\ \end{array} \right] % \end{align} $$ Solution $$ w_{2} = \left[ \begin{array}{c} -\frac{1}{3} \left(1+\sqrt{10}\right) \\ 1 \\ \end{array} \right] $$

Diagonalization matrix

$$ \mathbf{P} = \left[ \begin{array}{cc} \frac{1}{3} \left(-1+\sqrt{10}\right) & -\frac{1}{3} \left(1+\sqrt{10} \right) \\ 1 & 1 \\ \end{array} \right], \qquad \mathbf{P}^{-1} = \frac{1}{2\sqrt{10}} \left[ \begin{array}{rr} 3 & 1+\sqrt{10} \\ -3 & -1+\sqrt{10} \\ \end{array} \right] $$


Validation

You can check that $$ \mathbf{P}^{-1} \mathbf{A} \mathbf{P} = \mathbf{D} $$ and $$ \mathbf{P} \mathbf{D} \mathbf{P}^{-1} = \mathbf{A} $$
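The similarity relation $\mathbf{P}^{-1}\mathbf{A}\mathbf{P} = \mathbf{D}$ for this (unnormalized) eigenvector matrix can be checked numerically; a sketch with NumPy:

```python
import numpy as np

s = np.sqrt(10.0)
A = np.array([[2.0, 3.0],
              [3.0, 4.0]])

# Columns are the unnormalized eigenvectors w1, w2 derived above
P = np.array([[(-1.0 + s) / 3.0, -(1.0 + s) / 3.0],
              [1.0,               1.0]])

# Similarity transform: should give diag(3 + sqrt(10), 3 - sqrt(10))
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))
```

Because these columns are not normalized, $P$ here is not orthogonal, so $\mathbf{P}^{-1}$ (not $\mathbf{P}^T$) is required in the similarity.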



Gaussian elimination

Solving this problem does not require Gaussian elimination. However, since you specifically asked, here is the process:

Clear column 1 $$ \left[ \begin{array}{rc} \frac{1}{2} & 0 \\ -\frac{3}{2} & 1 \\ \end{array} \right] % \left[ \begin{array}{cc|cc} 2 & 3 & 1 & 0 \\ 3 & 4 & 0 & 1 \\ \end{array} \right] = \left[ \begin{array}{cr|rc} 1 & \frac{3}{2} & \frac{1}{2} & 0 \\ 0 & -\frac{1}{2} & -\frac{3}{2} & 1 \\ \end{array} \right] $$

Clear column 2 $$ \left[ \begin{array}{cr} 1 & 3 \\ 0 & -2 \\ \end{array} \right] % \left[ \begin{array}{cr|rc} 1 & \frac{3}{2} & \frac{1}{2} & 0 \\ 0 & -\frac{1}{2} & -\frac{3}{2} & 1 \\ \end{array} \right] = \left[ \begin{array}{cc|rr} 1 & 0 & -4 & 3 \\ 0 & 1 & 3 & -2 \\ \end{array} \right] $$
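The two elimination steps compose into a single matrix that is exactly $\mathbf{A}^{-1}$, which is what appears in the right-hand block. A small NumPy check of this composition (an illustration, not part of the original answer):

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, 4.0]])

# The two elimination matrices used above, applied in order
E1 = np.array([[ 0.5, 0.0],
               [-1.5, 1.0]])   # clear column 1
E2 = np.array([[1.0,  3.0],
               [0.0, -2.0]])   # clear column 2

E = E2 @ E1
print(np.round(E @ A, 10))  # identity: the final left block
print(E)                    # the final right block, equal to inv(A)
```

This highlights that pure row reduction of $[\,\mathbf{A} \mid \mathbf{I}\,]$ yields $\mathbf{A}^{-1}$, not the congruence matrix $P$ the question asks for; for $P^TAP = D$, the matching column operations from the accepted answer are still needed.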