Spectral decomposition using outer product notation


I have read that any normal operator, $M$, on a vector space $V$ is diagonal wrt some orthonormal basis for $V$.

The book I'm reading then says that $M$ can be written as $\sum_i \lambda_i |i\rangle\langle i|$ where the vectors are eigenvectors that make up the basis. I was under the impression that, when it is in this form (the eigenbasis), if you were to turn the outer products into matrices and add them the matrix would be diagonal, but when I do it for the matrix $$\begin{bmatrix}4&3\\3&4\end{bmatrix}$$ it just ends up back where I started. How does $\sum_i \lambda_i |i\rangle\langle i|$ indicate a diagonal matrix?


A normal operator $M$ on a finite-dimensional complex vector space $V$ is diagonal with respect to some orthonormal basis of $V$. (The result can fail if $V$ is a real vector space.) Concretely, it says that there exists an orthonormal basis $v_1, \dots, v_n$ of $V$ and scalars $\lambda_1, \dots, \lambda_n \in \mathbb{C}$ such that $Mv_i = \lambda_i v_i$ for each $i$. What is diagonal is the matrix representation of $M$ with respect to the basis $\{v_1, \dots, v_n\}$: that matrix is simply $\operatorname{diag}(\lambda_1, \dots, \lambda_n)$.

The sum of outer products is just another way to write $M$: if $x \in V$ we have \begin{align} Mx &= M\Big(\sum_{i = 1}^{n}(x, v_i)v_i\Big) \\ &= \sum_{i = 1}^{n}(x, v_i)Mv_i \\ &= \sum_{i = 1}^{n}(x, v_i)\lambda_i v_i \\ &= \sum_{i = 1}^{n}(v_i^*x)\lambda_i v_i \\ &= \sum_{i = 1}^{n}\lambda_i v_i(v_i^*x) \\ &= \sum_{i = 1}^{n}\lambda_i v_iv_i^*x. \end{align} Thus $M = \sum_{i = 1}^{n}\lambda_i v_iv_i^*$.
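You can check both facts numerically on the matrix from the question. A short NumPy sketch (not part of the original answer): reconstruct $M$ as $\sum_i \lambda_i v_i v_i^*$, and verify that the matrix of $M$ *in the eigenbasis*, namely $V^* M V$, is the diagonal matrix of eigenvalues.

```python
import numpy as np

# The questioner's example: a real symmetric (hence normal) matrix.
M = np.array([[4.0, 3.0],
              [3.0, 4.0]])

# eigh handles symmetric/Hermitian input and returns an orthonormal
# set of eigenvectors as the columns of V (eigenvalues in ascending order).
lam, V = np.linalg.eigh(M)

# Reconstruct M as the sum of outer products  sum_i lambda_i v_i v_i^*.
M_rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(len(lam)))
assert np.allclose(M_rebuilt, M)

# What is diagonal is the matrix of M with respect to the eigenbasis:
# V^* M V = diag(lambda_1, ..., lambda_n).
D = V.T @ M @ V
assert np.allclose(D, np.diag(lam))
```

The first assertion confirms that the outer-product sum really is $M$ itself (in the standard basis it looks nothing like a diagonal matrix); the second shows where the diagonal matrix actually lives.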


When you write a linear operator as a matrix, it is implicitly done with respect to a basis. So when you write an operator $M$ as a matrix like

$$\begin{pmatrix} 4 & 3 \\ 3 & 4 \end{pmatrix}$$

you are implicitly assuming that you are using the (ordered) basis $\{ (1,0), (0,1)\}$. In particular, for a vector $w = (\alpha_1,\alpha_2) = \alpha_1 (1,0) + \alpha_2 (0,1)$ we have $$ Mw = (4 \alpha_1 + 3 \alpha_2)(1,0) + (3 \alpha_1 + 4 \alpha_2)(0,1). $$

A linear map can be written with respect to any basis, in which case the matrix representation changes.

Let $v_1$ and $v_2$ be unit eigenvectors for $M$ and let $\lambda_1$ and $\lambda_2$ be the corresponding eigenvalues (I don't care about what they are for now). If $M$ is normal then $\{v_1,v_2\}$ is an orthonormal basis of $\mathbb{R}^2$. In particular, we can always write a vector $w$ in the form $$ w = \beta_1 v_1 + \beta_2 v_2. $$ Here, $\beta_1$ and $\beta_2$ will both be linear combinations of $\alpha_1$ and $\alpha_2$.

Using the eigenvector property, $$Mw = M(\beta_1 v_1 + \beta_2 v_2) = \beta_1 Mv_1 + \beta_2 Mv_2 = \lambda_1 \beta_1 v_1 + \lambda_2 \beta_2 v_2.$$ So in effect, the operator $M$ has scaled the first coefficient $\beta_1$ by $\lambda_1$ and the second coefficient $\beta_2$ by $\lambda_2$. Hence, with respect to the basis $\{v_1,v_2\}$, the operator $M$ can be written as the matrix $$ \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix} $$ as the vector $w$ may be written in this basis as $(\beta_1,\beta_2)$. It is in this sense that $M = \lambda_1 |v_1 \rangle \langle v_1| + \lambda_2 |v_2 \rangle \langle v_2|$ is diagonal.
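The coordinate picture above can be checked numerically. A small NumPy sketch (the vector $w$ below is an arbitrary choice, not from the answer): compute the eigenbasis coordinates $\beta_i = (w, v_i)$, scale each by its eigenvalue, and confirm this agrees with multiplying by $\operatorname{diag}(\lambda_1, \lambda_2)$ and with the ordinary product $Mw$.

```python
import numpy as np

M = np.array([[4.0, 3.0],
              [3.0, 4.0]])
lam, V = np.linalg.eigh(M)   # columns of V are the orthonormal eigenvectors v_1, v_2

w = np.array([2.0, 5.0])     # an arbitrary test vector (hypothetical choice)
beta = V.T @ w               # coordinates of w in the eigenbasis: w = beta_1 v_1 + beta_2 v_2

# Applying M just scales each coordinate by its eigenvalue ...
Mw_coords = lam * beta
# ... which is exactly multiplication by the diagonal matrix diag(lam):
assert np.allclose(Mw_coords, np.diag(lam) @ beta)

# Converting back to the standard basis recovers the usual product M @ w.
assert np.allclose(V @ Mw_coords, M @ w)
```

In the standard basis $M$ mixes the components of $w$; in the eigenbasis it only rescales them, which is precisely what "diagonal" means here.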