We know that the eigendecomposition of a symmetric matrix $C\in\mathbb{R}^{n\times n}$ yields the following:
$$C=V\Lambda V^T$$
where
$V\in\mathbb{R}^{n\times n}$ is the matrix whose columns are the eigenvectors of $C$, and
$\Lambda\in\mathbb{R}^{n\times n}$ is the diagonal matrix containing the eigenvalues of $C$.
Since the eigenvectors form an orthonormal basis, the product $V^TV=I_n$. This lets us conclude that $V^T$ is a left inverse of $V$. I have also seen $V^T$ written interchangeably with $V^{-1}$, indicating that it is a right inverse of $V$ as well:
$$VV^T=\sum_{i=1}^nv_iv_i^T=I_n$$
where $v_i$ is the $i^{\text{th}}$ column vector of $V$, i.e. the $i^{\text{th}}$ eigenvector of $C$.
I don't see why the above condition must always hold.
The relations $v_i^Tv_i=1$ and $v_i^Tv_j=0$ for $i\neq j$ immediately yield $V^TV=I_n$, but I'm struggling to see why this implies that the sum of the outer products must also equal the identity matrix.
I can prove it mathematically using $Cv_i=\lambda_iv_i\implies CV=V\Lambda\implies V^{-1}=V^T$, but I was looking for a proof that ties it directly to the orthonormality of the eigenvectors.
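As a numerical sanity check of the setup above, here is a minimal NumPy sketch (the symmetric matrix $C$ is an arbitrary random example, not one from the question) verifying that both $V^TV$ and the sum of outer products $\sum_i v_iv_i^T = VV^T$ come out as the identity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary symmetric matrix (an illustrative example)
A = rng.standard_normal((4, 4))
C = (A + A.T) / 2

# np.linalg.eigh returns orthonormal eigenvectors for a symmetric input
eigvals, V = np.linalg.eigh(C)

# Left inverse: V^T V = I_n (matrix of inner products v_i^T v_j)
assert np.allclose(V.T @ V, np.eye(4))

# Sum of outer products: sum_i v_i v_i^T = V V^T = I_n
outer_sum = sum(np.outer(V[:, i], V[:, i]) for i in range(4))
assert np.allclose(outer_sum, np.eye(4))
assert np.allclose(outer_sum, V @ V.T)
```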
A matrix product $AB$, usually defined entrywise via inner products, can equivalently be written as a sum of outer products of the columns of $A$ with the rows of $B$; in particular, $VV^T=\sum_{i=1}^n v_iv_i^T$. The matrix of inner products of an orthonormal basis, $V^TV$, is the identity matrix. Since $V$ is square, the left inverse $V^T$ is also its right inverse, so the sum of outer products equals $I_n$ as well.
Does this answer satisfy you?
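One standard way to see it directly from orthonormality (a sketch of the completeness argument, using the same symbols as the question): expand an arbitrary vector in the basis $\{v_i\}$,

$$x=\sum_{i=1}^n (v_i^Tx)\,v_i=\left(\sum_{i=1}^n v_iv_i^T\right)x\quad\text{for all }x\in\mathbb{R}^n,$$

and since the matrix $\sum_{i=1}^n v_iv_i^T$ acts as the identity on every $x$, it must equal $I_n$.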