How to show the equivalence between different versions of the spectral theorem?


Given a symmetric matrix $A$, the spectral theorem says that $A=U\Lambda U^T$, where the columns of $U$ are orthonormal eigenvectors of $A$ and $\Lambda$ is a diagonal matrix containing the eigenvalues of $A$. This, of course, requires that the eigenvectors are orthogonal. The theorem can be visualized as rotating into the eigenbasis via $U^T$, stretching each coordinate direction by the corresponding eigenvalue, and then rotating back via $U$.
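For the real symmetric case, the factorization $A=U\Lambda U^T$ can be checked numerically. A minimal sketch using NumPy (the matrix below is just a hypothetical example; `np.linalg.eigh` is the standard routine for symmetric/Hermitian eigendecomposition):

```python
import numpy as np

# A hypothetical symmetric matrix; any real symmetric A would do
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues (ascending) and an orthonormal set of
# eigenvectors as the columns of U, for symmetric/Hermitian input
eigvals, U = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Reconstruct A = U Λ Uᵀ and check orthonormality of the columns of U
print(np.allclose(A, U @ Lam @ U.T))    # reconstruction holds
print(np.allclose(U.T @ U, np.eye(2)))  # U is orthogonal
```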

Now, suppose we are given a self-adjoint (Hermitian) operator $A$. Since the operator $A$ is self-adjoint, the spectral theorem guarantees an orthonormal basis of eigenvectors, and we can write the spectral decomposition of this operator as $A=\sum_a a\,P(a)$. This states that the effect of applying $A$ to a vector is equivalent to projecting the vector onto each eigenvector subspace, scaling by the eigenvalue, and then summing up the results. But how can we show mathematically that these two expressions (spectral decompositions) of $A$ are equivalent, i.e. $\sum_a a\,P(a)=U\Lambda U^{\dagger}=A$?

BEST ANSWER

Using Dirac's bra-ket notation it is actually not too hard to show the equivalence of the two expressions.

$\Lambda$ is the diagonal matrix containing the eigenvalues of $A$. $$\Lambda=\begin{pmatrix} a_1 & 0 & \cdots & 0 \\ 0 & a_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_n \end{pmatrix} \tag{1}$$

The matrix $U$ contains the orthonormal ket-eigenvectors $|b_i\rangle$ as columns. $$U=\begin{pmatrix} |b_1\rangle & |b_2\rangle & \cdots & |b_n\rangle \end{pmatrix} \tag{2}$$

Likewise, the matrix $U^\dagger$ contains the orthonormal bra-eigenvectors $\langle b_i|$ as rows. $$U^\dagger=\begin{pmatrix} \langle b_1| \\ \langle b_2| \\ \vdots \\ \langle b_n| \end{pmatrix} \tag{3}$$ (This may be called an abuse of notation, but it helps to see the important structure.)

Now we can put (1), (2) and (3) together, do the matrix multiplications and see what we get for $U\Lambda U^\dagger$. $$\begin{align} U\Lambda U^\dagger &=\begin{pmatrix} |b_1\rangle & |b_2\rangle & \cdots & |b_n\rangle \end{pmatrix} \begin{pmatrix} a_1 & 0 & \cdots & 0 \\ 0 & a_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_n \end{pmatrix} \begin{pmatrix} \langle b_1| \\ \langle b_2| \\ \vdots \\ \langle b_n| \end{pmatrix} \\ &=\begin{pmatrix} |b_1\rangle & |b_2\rangle & \cdots & |b_n\rangle \end{pmatrix} \begin{pmatrix} a_1\langle b_1| \\ a_2\langle b_2| \\ \vdots \\ a_n\langle b_n| \end{pmatrix} \\ &=|b_1\rangle a_1\langle b_1|+|b_2\rangle a_2\langle b_2|+\cdots+|b_n\rangle a_n\langle b_n| \\ &=\sum_i |b_i\rangle a_i\langle b_i| \\ &=\sum_i a_i|b_i\rangle\langle b_i| \quad \text{here we recognize $|b_i\rangle\langle b_i|$ as the projection operators $P_i$} \\ &=\sum_i a_iP_i \end{align}$$
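The same identity can be verified numerically in the complex Hermitian case. A minimal sketch with NumPy (the matrix below is a hypothetical example; the sum of rank-one projectors $\sum_i a_i|b_i\rangle\langle b_i|$ is built with `np.outer`):

```python
import numpy as np

# A hypothetical Hermitian matrix; any Hermitian A would do
A = np.array([[2.0, 1.0j],
              [-1.0j, 3.0]])

# Eigenvalues a_i and orthonormal eigenvectors |b_i> as the columns of U
a, U = np.linalg.eigh(A)

# Sum of rank-one projectors: sum_i a_i |b_i><b_i|
# |b_i><b_i| is the outer product of the column with its conjugate
P_sum = sum(a[i] * np.outer(U[:, i], U[:, i].conj()) for i in range(len(a)))

# Matrix form: U Λ U†
ULUdag = U @ np.diag(a) @ U.conj().T

print(np.allclose(P_sum, ULUdag))  # the two decompositions agree
print(np.allclose(P_sum, A))       # and both reconstruct A
```

This mirrors the bra-ket derivation above step by step: each term `a[i] * np.outer(...)` is one summand $a_i|b_i\rangle\langle b_i|$, and `U @ np.diag(a) @ U.conj().T` is $U\Lambda U^\dagger$.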