Matrix diagonalization. Is $A = PDP^{-1} = P^{-1}DP$?


Diagonalization of a square matrix $A$ consists of finding matrices $P$ and $D$ such that $A = PDP^{-1}$, where $P$ is a matrix whose columns are eigenvectors of $A$, $D$ is the diagonal matrix built from the corresponding eigenvalues, and $P^{-1}$ is the matrix inverse of $P$.

I wonder: is $PDP^{-1} = P^{-1}DP$?

Can I say $A=P^{-1}DP$ too?


There are 4 solutions below.

BEST ANSWER

As $P^{-1}$ is usually not equal to $P$, we usually don't have $PDP^{-1} = P^{-1}DP$, and thus we usually don't have $A = P^{-1}DP$ when $P$ is the matrix whose columns are eigenvectors of $A$. However, if we set $Q = P^{-1}$, then we do have $A = PDP^{-1} = Q^{-1}DQ$. So in some sense, we could have defined diagonalisation that way; it is purely a matter of convention that we don't, and the convention is chosen because $P$ is easier to describe than $Q$.
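A quick numerical sanity check of this, as a sketch using NumPy (the matrix $A$ below is just an illustrative choice with distinct eigenvalues):

```python
import numpy as np

# Illustrative diagonalizable matrix: eigenvalues are -1 and 5.
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])

evals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(evals)
Pinv = np.linalg.inv(P)

# A = P D P^{-1} holds ...
print(np.allclose(A, P @ D @ Pinv))              # True
# ... but A = P^{-1} D P does not (in general):
print(np.allclose(A, Pinv @ D @ P))              # False
# Relabeling Q = P^{-1} recovers the other form: A = Q^{-1} D Q.
Q = Pinv
print(np.allclose(A, np.linalg.inv(Q) @ D @ Q))  # True
```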

So you can say that $A = P^{-1}DP$, but then $P$ would not be the matrix whose columns are eigenvectors of $A$ (although the rows of $P$ in this case would be left eigenvectors of $A$, i.e. row vectors $v$ satisfying $vA = \lambda v$).

When $P$ is the matrix consisting of (column) eigenvectors of $A$, what $PDP^{-1}$ does is that it decomposes "Multiplication by $A$" into three operations:

  1. Multiplying a column vector by $P^{-1}$ changes basis to a basis of eigenvectors of $A$. Given a column vector $v$, where the entries are the coefficients of a linear combination using your current basis, $P^{-1}v$ will be the coefficients describing the decomposition of the same vector using the eigenvectors of $A$ as a basis instead.
  2. Multiplying the result of this decomposition by $D$ stretches each of the eigenvectors of $A$ by their respective eigenvalues. So $DP^{-1}v$ expresses the same vector as $Av$, but in the basis of eigenvectors of $A$.
  3. Multiplying by $P$ translates back into whatever basis you had before.

For this process, it is crucial that the inverses are done at the right places.
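The three steps can be traced numerically; this is a sketch using NumPy, with an illustrative matrix and an arbitrary test vector:

```python
import numpy as np

# Illustrative diagonalizable matrix with eigenvalues -1 and 5.
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
evals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(evals)
Pinv = np.linalg.inv(P)

v = np.array([3.0, -1.0])     # an arbitrary vector in the standard basis

step1 = Pinv @ v              # 1. coordinates of v in the eigenbasis
step2 = D @ step1             # 2. stretch each coordinate by its eigenvalue
step3 = P @ step2             # 3. translate back to the original basis

print(np.allclose(step3, A @ v))   # True: the three steps reproduce Av
```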

ANSWER

You don't even have to know what an eigenvalue is in order to define diagonalizability of a square matrix $A$: it means there is a diagonal matrix $D$ such that, for some square invertible matrix $P$, you have $A = PDP^{-1}$. Which, of course, is the same thing as asserting that, for some square invertible matrix $P$, you have $A = P^{-1}DP$ (just replace $P$ by its inverse).

So, in that sense, the answer to your question is affirmative, though the two formulas generally require different choices of $P$.

ANSWER

Calculating a few examples yourself often invalidates a wrong guess. The first matrix I tried: $$PDP^{-1}= \begin{bmatrix} 1 & 4 \\ 2 & 3 \end{bmatrix}, \quad\text{where}\quad P= \begin{bmatrix} -2 & 1 \\ 1 & 1 \end{bmatrix}, \quad D = \begin{bmatrix}-1 & 0 \\ 0 & 5 \end{bmatrix}, \quad P^{-1} =\begin{bmatrix} -1/3 & 1/3 \\ 1/3 & 2/3\end{bmatrix};$$ $$P^{-1}D P = \begin{bmatrix} 1 & 2 \\ 4 & 3 \end{bmatrix} \neq PDP^{-1}.$$
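The arithmetic above can be checked in a few lines of NumPy, using the same $P$ and $D$:

```python
import numpy as np

# The matrices from the example above.
P = np.array([[-2.0, 1.0],
              [ 1.0, 1.0]])
D = np.diag([-1.0, 5.0])
Pinv = np.linalg.inv(P)

print(P @ D @ Pinv)   # [[1. 4.]
                      #  [2. 3.]]
print(Pinv @ D @ P)   # [[1. 2.]
                      #  [4. 3.]]  -- not the same matrix
```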

ANSWER

No,

$$\begin{pmatrix}1&2\\1&0\end{pmatrix} \begin{pmatrix}2&0\\0&1\end{pmatrix} \begin{pmatrix}0&1\\\frac12&-\frac12\end{pmatrix} =\begin{pmatrix}1&1\\0&2\end{pmatrix}$$ while

$$\begin{pmatrix}0&1\\\frac12&-\frac12\end{pmatrix} \begin{pmatrix}2&0\\0&1\end{pmatrix} \begin{pmatrix}1&2\\1&0\end{pmatrix}=\begin{pmatrix}1&0\\\frac12&2\end{pmatrix}.$$