Show that the spectral decomposition has the same eigenvalues as the matrix it is decomposed from.


Problem

Let $\textbf{A} \in \mathbb{R}^{n\times n}$ be a symmetric square matrix. $\textbf{A}$ has $n$ orthonormal eigenvectors $x_i \in \mathbb{R}^n$ and corresponding eigenvalues $\lambda_i \in \mathbb{R}$. The matrix $\textbf{B}$ is defined as $\textbf{B} = \sum_{j=1}^{n} \lambda_j x_j x_j^T$. Show that the $\lambda_i$ and $x_i$ are eigenvalues and eigenvectors of the matrix $\textbf{B}$ as well.

Attempt to solve

I've tried simply computing the eigenvalues of the matrix $\textbf{B}$ by solving the equation

$$ \det(\textbf{B}-\lambda I) = 0 \iff \det\left(\left(\sum_{j=1}^{n}\lambda_j x_j x_j^T\right)-\lambda I\right) = 0 $$ but I'm having difficulty computing the determinant this way. I'm not sure this is a good way to show the claim, but it's the first one that came to mind.

This should be quite trivial to show: isn't it true by the very definition of the $\textit{spectral decomposition}$, which is exactly the matrix $\textbf{B}$?

I am not too familiar with matrix decompositions and the like. I know how to compute eigenvalues and eigenvectors and have some rough intuition for what they mean geometrically, but that experience comes from computing them for concrete real-valued matrices in a given dimension. I haven't dealt with any proofs involving eigenvalues or eigenvectors before.
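As a sanity check before attempting a proof, the claim is easy to test numerically. The sketch below (NumPy, with a randomly generated symmetric matrix standing in for $\textbf{A}$; all variable names are illustrative) builds $\textbf{B} = \sum_j \lambda_j x_j x_j^T$ from the eigenpairs of $\textbf{A}$ and compares eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary symmetric matrix A = (M + M^T) / 2.
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

# eigh is the eigensolver for symmetric matrices: it returns the
# eigenvalues in ascending order and orthonormal eigenvectors as
# the columns of V.
eigvals, V = np.linalg.eigh(A)

# Reconstruct B = sum_j lambda_j x_j x_j^T from the eigenpairs.
B = sum(lam * np.outer(V[:, j], V[:, j]) for j, lam in enumerate(eigvals))

# B has the same eigenvalues (and eigenvectors) as A.
print(np.allclose(np.linalg.eigvalsh(B), eigvals))  # True
print(np.allclose(B, A))                            # True
```

In fact, for a symmetric $\textbf{A}$ this reconstruction returns $\textbf{A}$ itself, which is the content of the spectral theorem.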


2 Answers


Hint: you know what the eigenvectors/eigenvalues are supposed to be, so why not check to see if they work?
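A sketch of that check, using only the definition of $\textbf{B}$ (the last step just moves the scalar $x_j^T x_i$ past the vector $x_j$):

$$ \textbf{B}x_i = \left(\sum_{j=1}^{n}\lambda_j x_j x_j^T\right)x_i = \sum_{j=1}^{n}\lambda_j x_j \left(x_j^T x_i\right) $$

What remains is to evaluate the inner products $x_j^T x_i$.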


Hint: because the eigenvectors are orthonormal, $x_i^T x_j = \ldots$
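Filling in that hint: orthonormality means $x_j^T x_i = \delta_{ij}$ (equal to $1$ when $i = j$ and $0$ otherwise), so in the sum defining $\textbf{B}x_i$ every term with $j \neq i$ vanishes:

$$ \textbf{B}x_i = \sum_{j=1}^{n}\lambda_j x_j \left(x_j^T x_i\right) = \sum_{j=1}^{n}\lambda_j x_j \delta_{ij} = \lambda_i x_i $$

which is exactly the statement that $x_i$ is an eigenvector of $\textbf{B}$ with eigenvalue $\lambda_i$.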