I found a question on an old exam which I am not really able to wrap my head around. The question states:
Given $k<n$ and non-zero vectors $v_1, v_2, \ldots, v_k \in \mathbb{R}^n$ that are pairwise orthogonal with respect to the standard dot product on $\mathbb{R}^n$, where we consider the vectors of $\mathbb{R}^n$ as column vectors. Given $\lambda_1, \lambda_2, \ldots, \lambda_k \in \mathbb{R}$, let $$A = \lambda_1 v_1 v_1^T + \lambda_2 v_2 v_2^T + \cdots + \lambda_k v_k v_k^T \in \mathbb{R}^{n \times n}.$$
Part A asks to prove that $v_1, v_2, \ldots, v_k$ are eigenvectors of $A$. I managed to prove this by showing, for all $i = 1, 2, \ldots, k$, that $$Av_i = \left(\sum_{j=1}^k\lambda_jv_jv_j^T\right)v_i=\sum_{j=1}^k\lambda_jv_j(v_j^Tv_i)=\lambda_iv_i\|v_i\|^2=\left(\lambda_i\|v_i\|^2\right)v_i,$$ where the sum collapses to the $j=i$ term because $v_j^Tv_i = 0$ for $j \neq i$.
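As a sanity check of this computation (not part of the proof), one can build such an $A$ numerically from pairwise orthogonal vectors and verify $Av_i = \left(\lambda_i\|v_i\|^2\right)v_i$. The dimensions, scalings, and seed below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3

# Orthogonalise k random vectors via QR, then rescale the columns so
# they are pairwise orthogonal but not unit length.
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
V = Q * np.array([1.0, 2.0, 3.0])     # columns v_1, ..., v_k
lam = np.array([2.0, -1.0, 0.5])      # lambda_1, ..., lambda_k

# A = sum_j lambda_j v_j v_j^T
A = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(k))

for i in range(k):
    v = V[:, i]
    expected = lam[i] * np.dot(v, v) * v   # (lambda_i ||v_i||^2) v_i
    assert np.allclose(A @ v, expected)
```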
Part B asks to prove that $\dim(E_0) \geq n - k$, where $E_0$ is the eigenspace for the eigenvalue $0$. I don't really have a direct idea of how to get started with this part.
Part C asks to prove that $A$ is diagonalisable and to give an orthogonal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$. Again I am not really sure how to get started with this.
Any help is appreciated.
For Part B, you can find an orthogonal system $$v_{k+1},v_{k+2},\ldots, v_n\ ,$$ which is orthogonal to $\{v_1,v_2,\ldots, v_k\}$ (e.g. by Gram-Schmidt). Then $Av_{i}=\sum_{j=1}^k\lambda_jv_j(v_j^Tv_i)=0$ for $k<i\le n$, since each $v_j^Tv_i=0$. From this, deduce that $\dim E_0\ge n-k$, as these are $n-k$ linearly independent vectors in $E_0$.
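This construction can also be checked numerically (an illustration under arbitrary choices of $n$, $k$, and the $\lambda_j$, not a proof): take the columns of an orthogonal matrix as $v_1,\ldots,v_n$ and verify that $A$ annihilates the last $n-k$ of them.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2

# QR of a random square matrix gives n orthonormal columns.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, W = Q[:, :k], Q[:, k:]          # v_1..v_k and v_{k+1}..v_n
lam = rng.standard_normal(k)

A = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(k))

# Each column w of W is orthogonal to every v_j with j <= k, so
# A w = sum_j lam_j v_j (v_j^T w) = 0: W spans an (n-k)-dimensional
# subspace of the eigenspace E_0.
assert np.allclose(A @ W, np.zeros((n, n - k)))
```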
Combining Parts A and B, we see immediately that $v_1,v_2,\ldots,v_n$ is an orthogonal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$, so $A$ is diagonalisable.
Note: the spectral decomposition theorem says that every real symmetric matrix can be written in the same form as $A$.
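To illustrate the note: for a symmetric matrix, `np.linalg.eigh` returns an orthonormal eigenbasis, and summing $\lambda_i u_i u_i^T$ over that basis recovers the matrix, i.e. exactly the form of $A$ above. The matrix here is a random example:

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.standard_normal((4, 4))
S = (S + S.T) / 2                      # a generic real symmetric matrix

# eigh returns eigenvalues and orthonormal eigenvectors (columns of U).
eigvals, U = np.linalg.eigh(S)

# Spectral decomposition: S = sum_i lambda_i u_i u_i^T.
reconstructed = sum(eigvals[i] * np.outer(U[:, i], U[:, i]) for i in range(4))
assert np.allclose(S, reconstructed)
```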