I have been studying the spectral decomposition of matrices and found that it works for symmetric matrices, but it does not work for skew-symmetric ones: the sign of the reconstructed matrix comes out wrong.
The spectral decomposition is defined as $A = \lambda_1 q_1 q_1^T + \dots + \lambda_n q_n q_n^T$, where $A$ is a symmetric matrix, $\lambda_1, \dots, \lambda_n$ are its eigenvalues, and $q_1, \dots, q_n$ are its eigenvectors, which form an orthonormal basis.
That symmetry would be a problem is already visible from the outer products $q_k q_k^T$: each one is symmetric, so any sum of them is symmetric and can never equal a skew-symmetric matrix.
So how can I form a spectral decomposition of a skew-symmetric matrix? I could not find anything helpful about this online.
(PS: The skew-symmetric matrix I was dealing with had complex eigenvalues, in case that is relevant.)
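Here is a small numpy sketch of what I mean; the $2 \times 2$ matrix is just an assumed example, and I reconstruct with a plain transpose as in the symmetric formula:

```python
import numpy as np

# An example real skew-symmetric matrix
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

lam, Q = np.linalg.eig(A)  # complex eigenvalues +/- i, complex eigenvectors

# Naive reconstruction with the symmetric-case formula: sum_k lambda_k q_k q_k^T
# (plain transpose, no conjugation)
A_naive = sum(l * np.outer(q, q) for l, q in zip(lam, Q.T))

# Each outer product q q^T is symmetric, so A_naive is symmetric,
# while A is skew-symmetric -- the reconstruction cannot match.
print(np.allclose(A_naive, A))
```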
I'll assume you are dealing with real matrices.
A real skew-symmetric matrix $A$ is skew-Hermitian, and hence $iA$ is Hermitian. Now the spectral theorem for Hermitian matrices applies: $iA = \sum_k \theta_k e_k e_k^\dagger$ with real eigenvalues $\theta_k$ and orthonormal eigenvectors $e_k$, so $A = \sum_k (-i\theta_k)\, e_k e_k^\dagger$. Note the conjugate transpose $e_k^\dagger$ in place of the plain transpose used in the real symmetric case.
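A minimal numpy sketch of this, using an arbitrary example matrix (assumed, not from your question):

```python
import numpy as np

# An example real skew-symmetric matrix
A = np.array([[0.0, 2.0, -1.0],
              [-2.0, 0.0, 3.0],
              [1.0, -3.0, 0.0]])

# iA is Hermitian, so the Hermitian eigensolver applies
theta, E = np.linalg.eigh(1j * A)  # real theta_k, orthonormal columns e_k

# Reconstruct A = sum_k (-i theta_k) e_k e_k^dagger
# (conjugate transpose, not the plain transpose of the symmetric case)
A_rec = sum(-1j * t * np.outer(e, e.conj()) for t, e in zip(theta, E.T))

print(np.allclose(A_rec, A))  # True (up to round-off in the imaginary part)
```

The eigenvalues of $A$ itself are the purely imaginary numbers $-i\theta_k$, which matches your observation that the skew-symmetric matrix has complex eigenvalues.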