Can someone please explain what $D = P^{-1}AP$ means, in simple terms?


I know how to determine whether or not a matrix is diagonalizable, but after that the question will usually ask me to find an invertible matrix $P$ and a diagonal matrix $D$ such that $$D = P^{-1}AP.$$

I was hoping someone could explain what this means, why it's important, and maybe give a very simple example.


BEST ANSWER

A linear transformation is diagonalizable if there is some basis in which the matrix representation of that transformation is a diagonal matrix. A matrix is diagonalizable if it represents such a transformation.

So $P$ is a change-of-basis matrix that takes us from the basis in which $A$ is described (say the standard basis) to whatever basis makes the transformation be represented by the diagonal matrix $D$ instead. The columns of $P$ are eigenvectors of $A$ (written in the standard basis), and the diagonal entries of $D$ are the corresponding eigenvalues in the same order.

Really, I'm more used to seeing $$A=PDP^{-1}$$ than the other way around. The most immediate use of this is how much easier it is to calculate high powers of the right-hand side than of the left-hand side: you just calculate the corresponding powers of the diagonal elements.

As an example, consider $$A=\begin{bmatrix}2&1\\1&2\end{bmatrix}$$(Symmetric matrices are always nice and diagonalizable, so it's a good place to look for examples.) Doing the rote calculations, we get $$ \begin{bmatrix}2&1\\1&2\end{bmatrix}=\begin{bmatrix}1&1\\1&-1\end{bmatrix}\begin{bmatrix}3&0\\0&1\end{bmatrix}\begin{bmatrix}\frac12&\frac12\\\frac12&-\frac12\end{bmatrix} $$ as, for instance, $\left[\begin{smallmatrix}1\\1\end{smallmatrix}\right]$ is an eigenvector of $A$ with eigenvalue $3$, so it's the first column of $P$ and the first entry of $D$.

Now consider trying to calculate $A^{500}$. Not a task I would at all enjoy doing by hand. But using the powers of diagonalization, we get $$A^{500}=\left(PDP^{-1}\right)^{500}=PD^{500}P^{-1},$$ since all the inner $P^{-1}P$ factors cancel. And calculating $D^{500}$ is comparatively easy, assuming you know your powers of $3$.
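This is easy to check numerically. Here is a minimal sketch with NumPy, using the $2\times 2$ matrix from the example above (I use a 10th power rather than the 500th just to keep the numbers small; the names are my own):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# Eigendecomposition: the columns of P are eigenvectors and eigvals holds the
# matching eigenvalues. NumPy may order and scale the eigenvectors differently
# from the hand calculation above, but A = P D P^{-1} still holds.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
assert np.allclose(P @ D @ np.linalg.inv(P), A)

# A^10 by repeated matrix multiplication...
slow = np.linalg.matrix_power(A, 10)

# ...and via the diagonalization: only the diagonal entries get powered.
fast = P @ np.diag(eigvals ** 10) @ np.linalg.inv(P)
assert np.allclose(slow, fast)
```

For a $2\times 2$ matrix the saving is negligible, but for large matrices and large exponents, powering $n$ diagonal entries beats repeated matrix multiplication by a wide margin.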

ANOTHER ANSWER

It's good to know that linear transformations correspond to matrices, that this correspondence depends on choosing a basis, and that replacing one basis with another replaces a matrix $A$ by a matrix $P^{-1}AP$, where $P$ expresses one basis in terms of the other. But it is also good to know that the entire discussion can be framed in purely matrix terms, without mentioning bases. Given a matrix $A$, there are many reasons to look for an invertible matrix $P$ such that $P^{-1}AP=D$ with $D$ diagonal. One important application is computing powers of $A$, which is very useful even if you don't care at all about bases or linear transformations, but there are many others.

For example, suppose you have a system of differential equations $$\begin{bmatrix}x_1'\\\vdots\\x_n'\end{bmatrix}=A\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}$$ and you have an invertible matrix $P$. If you define new variables $u_1,\dots,u_n$ by $$\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}=P\begin{bmatrix}u_1\\\vdots\\u_n\end{bmatrix},$$ it follows that $$\begin{bmatrix}u_1'\\\vdots\\u_n'\end{bmatrix}=P^{-1}AP\begin{bmatrix}u_1\\\vdots\\u_n\end{bmatrix}.$$ If $P^{-1}AP=D$ where $D$ is diagonal, say with diagonal entries $\lambda_1,\dots,\lambda_n$, then the differential equations for $u_1,\dots,u_n$ decouple into $u_i'=\lambda_i u_i$ and are very easy to solve. In fact, the solution is $$u_1=c_1e^{\lambda_1 t},\ \dots,\ u_n=c_ne^{\lambda_n t}.$$ Once the $u_i$ are known, it follows that $$\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}=P\begin{bmatrix}c_1e^{\lambda_1 t}\\\vdots\\c_ne^{\lambda_n t}\end{bmatrix}.$$

One last remark: I am a bit puzzled by your statement that you know how to determine whether $A$ is diagonalizable but apparently regard finding $P$ and $D$ as a separate question. The way that I answer these questions is to do them together.
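The change-of-variables recipe for the differential-equation example above can be checked numerically. A minimal sketch (the specific matrix, initial condition, and function name `x` are my own choices; I verify against the closed-form solution for this matrix):

```python
import numpy as np

# Solve x' = A x with x(0) = x0 by diagonalizing A.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
x0 = np.array([1.0, 0.0])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
c = np.linalg.solve(P, x0)      # u(0) = P^{-1} x(0) fixes the constants c_i

def x(t):
    # u_i(t) = c_i * exp(lambda_i * t), then transform back: x = P u
    return P @ (c * np.exp(eigvals * t))

# For this A the eigenvalues are 3 and 1, and with x0 = (1, 0) the exact
# solution works out to x(t) = ((e^{3t}+e^t)/2, (e^{3t}-e^t)/2).
t = 0.7
expected = 0.5 * np.array([np.exp(3*t) + np.exp(t),
                           np.exp(3*t) - np.exp(t)])
assert np.allclose(x(t), expected)
```

The point is that all the actual integration happens in the decoupled $u$-coordinates, where each equation involves only one variable.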
First, given an $n \times n$ real matrix $A$, I try to solve the equation $\det(\lambda I-A)=0$. If the roots are not all real, then we know immediately that $A$ is not diagonalizable over $\mathbb R$. Suppose all the roots $\lambda_1,\dots,\lambda_m$ are real and that the algebraic multiplicity of $\lambda_i$ in the characteristic equation $\det(\lambda I-A)=0$ is $\mu_i$, $1 \le i \le m$. Then for each $i$, I check whether or not there are $\mu_i$ linearly independent column vectors $\mathbf v$ such that $(\lambda_i I-A)\mathbf v=0$. [Yes, I know that we have found a basis for the eigenspace, and that I said we were not going to talk about bases. I have put a little water in my wine.] If there is any $i$ for which this condition is not satisfied, the matrix $A$ is not diagonalizable. If the condition is satisfied for all $i$, then we set $D$ to be the diagonal matrix in which $\lambda_1,\dots,\lambda_m$ are the diagonal entries, with each $\lambda_i$ repeated $\mu_i$ times. The matrix $P$ is formed by writing down, side by side, all the linearly independent column vectors $\mathbf v$ that we found for the $\lambda_i$, in the same order.
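The procedure just described can be sketched in a few lines with SymPy. This is only a sketch (the function name `diagonalize` is mine, and SymPy's `eigenvects` handles complex eigenvalues too, so this version tests diagonalizability over $\mathbb C$; restricting to real roots as in the text would be one extra check). `eigenvects` returns each eigenvalue together with its algebraic multiplicity and a basis of eigenvectors, which is exactly the data the procedure needs:

```python
from sympy import Matrix

def diagonalize(A):
    """Mirror the procedure above: for each eigenvalue, compare the number of
    independent eigenvectors (geometric multiplicity) with its algebraic
    multiplicity; if they all match, assemble P and D column by column."""
    cols, diag = [], []
    for lam, alg_mult, vecs in A.eigenvects():
        if len(vecs) < alg_mult:   # too few eigenvectors for this eigenvalue
            return None            # A is not diagonalizable
        cols.extend(vecs)
        diag.extend([lam] * alg_mult)
    return Matrix.hstack(*cols), Matrix.diag(*diag)

A = Matrix([[2, 1], [1, 2]])
P, D = diagonalize(A)
assert P.inv() * A * P == D   # D = P^{-1} A P, exact symbolic arithmetic

# A shear matrix has eigenvalue 1 with algebraic multiplicity 2 but only one
# independent eigenvector, so it is not diagonalizable:
assert diagonalize(Matrix([[1, 1], [0, 1]])) is None
```

Because SymPy works with exact arithmetic, the multiplicity comparison is reliable here, whereas with floating-point eigenvalues deciding when two roots coincide is itself a delicate question.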