If $\{u_1, \cdots, u_n\}$ is an orthonormal basis for $\mathbb{R}^n$, and if $A$ can be expressed as $$A = c_1u_1u_1^T + \cdots + c_nu_nu_n^T$$ then $A$ is symmetric and has eigenvalues $c_1, \cdots, c_n$.
I'm trying to prove this. Here's what I have so far.
I figure I need to show:
- $A$ is symmetric. I can achieve this by showing that $A$ is orthogonally diagonalizable (or equivalently, that $A$ has an orthonormal set of $n$ eigenvectors). If $P$ orthogonally diagonalizes $A$, then $D = P^TAP$, or equivalently $A = PDP^T$. Then $A^T = (PDP^T)^T = (P^T)^TD^TP^T = PDP^T = A$, since $D$ is diagonal and thus $D^T = D$.
- $c_1, \cdots, c_n$ are the eigenvalues of $A$.
I think both of these would be satisfied if I could show that $c_1u_1u_1^T + \cdots + c_nu_nu_n^T$ is equal to $PDP^T$ for an orthogonal matrix $P$ and a $D$ such that $D_{ij} = \begin{cases}0 & i \neq j\\c_i & i = j\end{cases}$.
If $P = \begin{bmatrix} u_1 & \cdots & u_n\end{bmatrix}$ were an orthogonal matrix whose columns $u_j$ are eigenvectors of $A$, then those columns would also form a basis for $\mathbb{R}^n$, since we'd have $n$ linearly independent vectors. Given this, I believe you can do the tedious matrix multiplication of $PDP^T$, with $D$ defined as above, and obtain $A = c_1u_1u_1^T + \cdots + c_nu_nu_n^T$. Then I'd be done.
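For reference, that multiplication is less tedious when done blockwise, using the notation above: $$PD = \begin{bmatrix} u_1 & \cdots & u_n\end{bmatrix}\begin{bmatrix} c_1 & & \\ & \ddots & \\ & & c_n\end{bmatrix} = \begin{bmatrix} c_1u_1 & \cdots & c_nu_n\end{bmatrix},$$ so $$PDP^T = \begin{bmatrix} c_1u_1 & \cdots & c_nu_n\end{bmatrix}\begin{bmatrix} u_1^T \\ \vdots \\ u_n^T\end{bmatrix} = c_1u_1u_1^T + \cdots + c_nu_nu_n^T.$$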
But to me the question implies that any orthonormal basis for $\mathbb{R}^n$ would satisfy this. Perhaps I need to show that if $A$ can be expressed with those basis vectors then those basis vectors must be the eigenvectors of $A$. I'm kind of stuck on this part though!
Edit: To be clear: I have outlined here my approach to the proof and what I know to be true. I'm ultimately stuck on how to prove the quoted question. I am asking how one can prove this.
This is exercise 7.2.26 of Anton and Rorres' Elementary Linear Algebra, 11th ed.
I'm not quite sure what you're asking in your question, but if it's helpful, here's how I would write this proof.
1) If $$A=\sum_{i=1}^n c_iu_iu_i^T,$$then observe that $$A^T=\left(\sum_{i=1}^nc_iu_iu_i^T\right)^T=\sum_{i=1}^n c_i(u_i^T)^Tu_i^T=\sum_{i=1}^n c_iu_iu_i^T=A,$$ where the second equality follows since taking transposes reverses the order of multiplication for matrices, and we can always pull constants out front.
2) If $A$ has the form above, then to show $c_j$ is an eigenvalue, consider the following product: $$Au_j= \sum_{i=1}^nc_iu_iu_i^Tu_j=\sum_{i=1}^nc_iu_i\delta_{ij}=c_ju_j.$$ The second equality follows from the fact that the $u_i$ form an orthonormal basis so $u_i^Tu_j=\delta_{ij}$ (by definition of orthonormal).
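As a sanity check (not part of the proof), here is a small numerical example in plain Python. The basis $u_1 = \frac{1}{\sqrt{2}}(1,1)$, $u_2 = \frac{1}{\sqrt{2}}(1,-1)$ and coefficients $c_1 = 2$, $c_2 = 5$ are arbitrary choices for illustration; the code builds $A = c_1u_1u_1^T + c_2u_2u_2^T$ and checks that $A$ is symmetric and that $Au_j = c_ju_j$.

```python
import math

# An orthonormal basis of R^2 (example values) and chosen coefficients
s = 1 / math.sqrt(2)
u = [[s, s], [s, -s]]  # u[0] = u_1, u[1] = u_2
c = [2.0, 5.0]

# Build A = c_1 u_1 u_1^T + c_2 u_2 u_2^T entrywise
A = [[sum(c[k] * u[k][i] * u[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

# Check symmetry: A[i][j] == A[j][i] (up to floating-point error)
assert abs(A[0][1] - A[1][0]) < 1e-12

# Check A u_j = c_j u_j for each basis vector
for k in range(2):
    Au = [sum(A[i][j] * u[k][j] for j in range(2)) for i in range(2)]
    assert all(abs(Au[i] - c[k] * u[k][i]) < 1e-12 for i in range(2))

print(A)
```

Here $A$ works out to $\begin{bmatrix} 3.5 & -1.5 \\ -1.5 & 3.5 \end{bmatrix}$ (up to rounding), whose eigenvalues are indeed $2$ and $5$.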