Consider the spectral decomposition of a positive definite matrix, $A=\sum_k \lambda_k v_k v_k^\dagger$, where $\lambda_k>0$ and the $v_k$ are orthonormal vectors. How do we prove that the set $\{v_kv_l^\dagger\}$ is linearly independent?
If $A$ is $n\times n$, then there are $n$ linearly independent, orthonormal eigenvectors $v_k$. There are $n^2$ members in the set $\{v_kv_l^\dagger\}$, and a basis of the space of $n\times n$ matrices must have exactly $n^2$ members.
i.e., we have to prove that the set $\{v_kv_l^\dagger\}$ forms a basis for the space of $n\times n$ matrices.
Note: I am not asking for a proof that proceeds directly by showing
$\sum_{kl} c_{kl}v_kv^\dagger_l=0$ iff $c_{kl}=0$ for all $k,l$.
Original Context
___________________________________________________________________
My Attempt
Theorem 1: If $\{v_k\}$ and $\{w_k\}$ are bases of the vector spaces $V$ and $W$ respectively, then $\{v_i\otimes w_j\}$ is a basis of the vector space $V\otimes W$,
where $\otimes$ represents the tensor product of vectors.
In the case of column vectors, the tensor product can be seen as a flattening of the outer product $v_i\otimes_{outer} w_j$, i.e., there is a linear one-to-one correspondence between the vector spaces $V\otimes W$ and $V\otimes_{outer} W$.
i.e., $V\otimes W$ is isomorphic to $V\otimes_{outer} W$: $\;V\otimes W\cong V\otimes_{outer} W$.
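This correspondence can be checked numerically. With NumPy's row-major flattening, the flattened outer product $v\,w^\dagger$ coincides entry-by-entry with the Kronecker (tensor) product of $v$ and $\bar w$ (a minimal sketch; the vectors and the dimension are arbitrary choices, not from the original question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
# Two arbitrary complex column vectors.
v = rng.normal(size=n) + 1j * rng.normal(size=n)
w = rng.normal(size=n) + 1j * rng.normal(size=n)

# Outer product v w^dagger, an n x n matrix with entries v[i] * conj(w[j]).
outer = np.outer(v, w.conj())

# Row-major flattening of the outer product equals the Kronecker
# product of v with conj(w): both have entry v[i]*conj(w[j]) at index i*n + j.
assert np.allclose(outer.flatten(), np.kron(v, w.conj()))
```

This is exactly the one-to-one correspondence between $V\otimes_{outer} W$ and $V\otimes W$ used above, realized concretely as "reshape an $n\times n$ matrix into a length-$n^2$ vector".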
Theorem 2: If $T:V\to W$ is an isomorphism, then the vectors $\{v_i\}$ form a basis for $V$ iff $\{T(v_i)\}$ form a basis for $W$.
Since the $v_i\otimes w_j$ form a basis for the vector space $V\otimes W$ (Theorem 1), it follows from Theorem 2 that the $v_i\otimes_{outer} w_j$ form a basis for the vector space $V\otimes_{outer} W$.
In particular, the $v_i\otimes_{outer} w_j$ are linearly independent.
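Applying this to the question itself: take the orthonormal eigenvectors $v_k$ of a positive definite $A$, stack the $n^2$ flattened outer products $v_k v_l^\dagger$ as columns of a matrix, and check that the matrix has full rank $n^2$ (a numerical sketch using an arbitrarily constructed random test matrix, not any specific $A$ from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# Build an arbitrary positive definite matrix A = B B^dagger + I.
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = B @ B.conj().T + np.eye(n)

# Orthonormal eigenvectors are the columns of V; eigenvalues are all > 0.
eigvals, V = np.linalg.eigh(A)
assert np.all(eigvals > 0)

# Stack the n^2 flattened outer products v_k v_l^dagger as columns.
M = np.column_stack([
    np.outer(V[:, k], V[:, l].conj()).flatten()
    for k in range(n) for l in range(n)
])

# Rank n^2 means the n^2 matrices {v_k v_l^dagger} are linearly independent,
# hence a basis of the n^2-dimensional matrix space.
assert np.linalg.matrix_rank(M) == n * n
```

The rank check succeeds because the $v_k v_l^\dagger$ are in fact orthonormal with respect to the Hilbert-Schmidt inner product $\langle X,Y\rangle = \operatorname{tr}(X^\dagger Y)$, which follows directly from the orthonormality of the $v_k$.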
