I am reading in a textbook that for a symmetric matrix A, we can write
$$ A = P\Lambda P^T = \sum_i \vec{e_i}\vec{e_i}^T\lambda_i $$
where $\vec{e_i}$ are the eigenvectors of the matrix, $P$ is an orthogonal matrix with $\vec{e_i}$ as the columns, and $\lambda_i$ are the corresponding eigenvalues.
However, I am not seeing why the second equality holds, i.e., why that summation equals the matrix product $P\Lambda P^T$. I have tried writing out the matrices on paper and comparing them element-wise, but it is a little too convoluted for me to make anything of it.
Consider the vectors of the standard basis, i.e., a vector $\vec i$ whose entry at index $i$ is one and whose other entries are zero. The product $\vec i \ \vec j {}^{\rm T}$ is a matrix whose entry at location $(i,j)$ is one and whose other entries are zero. The diagonal matrix $\Lambda$ can therefore be written as a sum of such rank-one matrices, scaled by the eigenvalues: $$\Lambda =\sum_i \vec i \ \vec i {}^{\rm T} \lambda_i.$$ Then $$P \Lambda P^{\rm T} =P \Big(\sum_i \vec i \ \vec i {}^{\rm T} \lambda_i\Big) P^{\rm T} =\sum_i P \vec i \ \vec i {}^{\rm T} P^{\rm T}\lambda_i =\sum_i (P \vec i) (P \vec i) {}^{\rm T} \lambda_i. $$ Finally, a product of the form $P \vec i$ extracts the $i$th column of $P$, which is precisely $\vec e_i$, so the last sum is $\sum_i \vec e_i \vec e_i^{\rm T} \lambda_i$, as claimed.
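If it helps to see the identity numerically, here is a quick NumPy check (the matrix `A` below is just an arbitrary symmetric example): `np.linalg.eigh` returns the eigenvalues and an orthogonal matrix of eigenvectors for a symmetric matrix, and both forms of the decomposition reproduce `A`.

```python
import numpy as np

# An arbitrary symmetric matrix for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is designed for symmetric (Hermitian) matrices:
# lam holds the eigenvalues, the columns of P are the eigenvectors e_i.
lam, P = np.linalg.eigh(A)
Lambda = np.diag(lam)

# Form 1: A = P Λ P^T
assert np.allclose(A, P @ Lambda @ P.T)

# Form 2: A = Σ_i λ_i e_i e_i^T  (sum of rank-one outer products)
S = sum(lam[i] * np.outer(P[:, i], P[:, i]) for i in range(len(lam)))
assert np.allclose(A, S)
```

Both assertions pass, and comparing the two forms term by term is exactly the calculation carried out symbolically above.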