Proving that a matrix is symmetric if it can be expressed as a spectral decomposition

If $\{u_1, \cdots, u_n\}$ is an orthonormal basis for $\mathbb{R}^n$, and if $A$ can be expressed as $$A = c_1u_1u_1^T + \cdots + c_nu_nu_n^T$$ then $A$ is symmetric and has eigenvalues $c_1, \cdots, c_n$.

I'm trying to prove this. Here's what I have so far.


I figure I need to show:

  1. $A$ is symmetric. I can achieve this by showing that $A$ is orthogonally diagonalizable (equivalently, that $A$ has an orthonormal set of $n$ eigenvectors). If $P$ orthogonally diagonalizes $A$, so that $D = P^TAP$ and hence $A = PDP^T$, then $A^T = (PDP^T)^T = PD^TP^T = PDP^T = A$, since $D$ is diagonal and thus $D^T = D$.
  2. $c_1, \cdots, c_n$ are the eigenvalues of $A$.

I think both of these would be satisfied if I could show that $c_1u_1u_1^T + \cdots + c_nu_nu_n^T$ equals $PDP^T$ for some orthogonal matrix $P$ and the diagonal matrix $D$ with $D_{ij} = \begin{cases}0 & i \neq j\\c_i & i = j\end{cases}$.

If $P = \begin{bmatrix} u_1 & \cdots & u_n\end{bmatrix}$ were an orthogonal matrix whose columns $u_j$ are eigenvectors of $A$, then those columns would also form a basis for $\mathbb{R}^n$, since they are $n$ linearly independent vectors. If I had this, then I believe the tedious matrix multiplication $PDP^T$, with $D$ defined as above, yields $A = c_1u_1u_1^T + \cdots + c_nu_nu_n^T$. Then I'd be done.
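For what it's worth, the claimed identity $PDP^T = c_1u_1u_1^T + \cdots + c_nu_nu_n^T$ can at least be checked numerically (a sketch, not a proof; the orthonormal columns come from an arbitrary QR factorization and the $c_i$ are arbitrary):

```python
import numpy as np

# Check that P D P^T equals the sum of rank-one terms c_j u_j u_j^T,
# where the u_j are the (orthonormal) columns of P and D = diag(c).
rng = np.random.default_rng(1)
n = 3
P, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns u_j
c = np.array([5.0, -2.0, 1.0])                    # arbitrary coefficients
D = np.diag(c)

sum_form = sum(c[j] * np.outer(P[:, j], P[:, j]) for j in range(n))
assert np.allclose(P @ D @ P.T, sum_form)
```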

But to me the question implies that any orthonormal basis for $\mathbb{R}^n$ would satisfy this. Perhaps I need to show that if $A$ can be expressed with those basis vectors then those basis vectors must be the eigenvectors of $A$. I'm kind of stuck on this part though!

Edit: To be clear: I have outlined here my approach to the proof and what I know to be true. I'm ultimately stuck on how to prove the quoted question. I am asking how one can prove this.
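(A quick numerical sanity check, not a proof, agrees with the quoted statement; here the orthonormal basis comes from a QR factorization of a random matrix and the $c_i$ are chosen arbitrarily:)

```python
import numpy as np

# Build A = c_1 u_1 u_1^T + ... + c_n u_n u_n^T from an orthonormal basis,
# then verify that A is symmetric with eigenvalues c_1, ..., c_n.
rng = np.random.default_rng(0)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns are orthonormal
c = np.array([3.0, -1.0, 2.0, 0.5])               # arbitrary c_i

A = sum(c[i] * np.outer(Q[:, i], Q[:, i]) for i in range(n))

assert np.allclose(A, A.T)                                       # symmetric
assert np.allclose(np.sort(np.linalg.eigvalsh(A)), np.sort(c))   # eigenvalues
```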


This is exercise 7.2.26 of Anton and Rorres' Elementary Linear Algebra, 11th ed.

There are 3 answers below.

BEST ANSWER

I'm not quite sure what you're asking in your question, but if it's helpful, here's how I would write this proof.

1) If $$A=\sum_{i=1}^n c_iu_iu_i^T,$$then observe that $$A^T=\left(\sum_{i=1}^nc_iu_iu_i^T\right)^T=\sum_{i=1}^n c_i(u_i^T)^Tu_i^T=\sum_{i=1}^n c_iu_iu_i^T=A,$$ where the second equality follows since taking transposes reverses the order of multiplication for matrices, and we can always pull constants out front.

2) If $A$ has the form above, then to show $c_j$ is an eigenvalue, consider the following product: $$Au_j= \sum_{i=1}^nc_iu_iu_i^Tu_j=\sum_{i=1}^nc_iu_i\delta_{ij}=c_ju_j.$$ The second equality follows from the fact that the $u_i$ form an orthonormal basis so $u_i^Tu_j=\delta_{ij}$ (by definition of orthonormal).
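Both steps of this proof can be checked numerically as well; a small sketch (arbitrary orthonormal basis and coefficients) confirming that each $u_j$ is an eigenvector with eigenvalue $c_j$:

```python
import numpy as np

# For A = sum_i c_i u_i u_i^T, verify A u_j = c_j u_j for each j.
# This relies on u_i^T u_j = delta_ij for the orthonormal columns of U.
rng = np.random.default_rng(2)
n = 4
U, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal basis vectors
c = np.array([1.0, 4.0, -3.0, 2.0])               # arbitrary eigenvalues
A = sum(c[i] * np.outer(U[:, i], U[:, i]) for i in range(n))

for j in range(n):
    assert np.allclose(A @ U[:, j], c[j] * U[:, j])
```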

ANOTHER ANSWER

By inspection from the hypothesis we have that

$$A^T= (c_1u_1u_1^T + \cdots + c_nu_nu_n^T)^T=A$$

and

$$A\cdot u_i=c_iu_i$$

therefore the conclusion follows.

ANOTHER ANSWER

Note that

$(u_i u_i^T)^T = (u_i^T)^Tu_i^T = u_i u_i^T; \tag 1$

thus each matrix $u_i u_i^T$ is symmetric; hence every $c_i u_i u_i^T$ and hence their sum. This shows that

$A^T = A. \tag 2$

We further note that, since the $u_i$ are orthonormal,

$u_i^T u_j = \delta_{ij}, \tag 3$

whence

$A u_j = \displaystyle \left ( \sum_{i = 1}^n c_i u_i u_i^T \right ) u_j = \sum_{i = 1}^n c_i u_i u_i^Tu_j = \sum_{i = 1}^n c_iu_i \delta_{ij} = c_j u_j, \tag 4$

which shows that $c_j$ is an eigenvalue of $A$ with associated eigenvector $u_j$, $1 \le j \le n$.