From my lecture notes, I learned that an $N\times N$ real symmetric matrix $\mathbf{A}$ has a complete set of $N$ orthogonal eigenvectors $\hat{e}^{k}$, $k=1,\ldots,N$, which can be normalised such that: $$ \forall k : \quad \mathbf{A} \hat{e}^{k}=\mu_{k} \hat{e}^{k}, \qquad \forall k, k^{\prime} : \quad \hat{e}^{k} \cdot \hat{e}^{k^{\prime}}=\delta_{k k^{\prime}} $$
where $\left\{\mu_{1}, \ldots, \mu_{N}\right\}$ are the $N$ eigenvalues of $\mathbf{A}$, which are not necessarily distinct.
However, in my case I deal with a random real matrix $\mathbf{A}$ which is not symmetric. What conditions are necessary to ensure that it has a complete set of $N$ orthogonal eigenvectors?
My aim is to use the $N$ eigenvectors as a new basis of $\mathbb{R}^N$ and represent any vector as: $$\vec{a}=\sum_{k=1}^{N} \sigma_{k}\hat{e}^{k}$$ for some coefficients $\left\{\sigma_{1}, \ldots, \sigma_{N}\right\}$. To this end I need to be sure that those eigenvectors indeed form a basis. How can I be sure?
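For reference, if such an orthonormal eigenbasis does exist, the coefficients are fixed by orthonormality: taking the dot product of the expansion with $\hat{e}^{k^{\prime}}$ gives $$ \hat{e}^{k^{\prime}} \cdot \vec{a}=\sum_{k=1}^{N} \sigma_{k}\, \hat{e}^{k^{\prime}} \cdot \hat{e}^{k}=\sum_{k=1}^{N} \sigma_{k}\, \delta_{k k^{\prime}}=\sigma_{k^{\prime}}, $$ so each $\sigma_{k}$ is just the projection $\hat{e}^{k} \cdot \vec{a}$.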
Thanks
A complete set of $N$ orthogonal (real) eigenvectors implies that $A$ is symmetric — so for a real matrix, symmetry is exactly the condition you need.
If $A$ has a complete set of $N$ linearly independent eigenvectors, then it is diagonalizable. In other words, we may write $$ A = QDQ^{-1} $$ where $Q$ is a matrix with the eigenvectors of $A$ as columns, and $D$ is a diagonal matrix with the corresponding eigenvalues of $A$ (repeated according to multiplicity) along the diagonal.
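As a quick numerical sanity check of the factorization $A = QDQ^{-1}$ (a sketch using NumPy; here a generic random matrix is diagonalizable with probability 1, possibly over $\mathbb{C}$):

```python
import numpy as np

rng = np.random.default_rng(1)

# A generic random real matrix is (almost surely) diagonalizable,
# though its eigenvalues/eigenvectors may be complex.
A = rng.standard_normal((3, 3))
w, V = np.linalg.eig(A)          # w: eigenvalues, V: eigenvectors as columns

# Reconstruct A = V D V^{-1} and compare with the original.
D = np.diag(w)
A_rebuilt = V @ D @ np.linalg.inv(V)
print(np.allclose(A, A_rebuilt))  # True (up to floating-point error)
```

Note that `np.linalg.eig` returns the eigenvectors as the *columns* of `V`, matching the role of $Q$ above.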
If the eigenvectors of $A$ are all pairwise orthogonal, and we normalize the eigenvectors so that they all have unit length, this makes $Q$ into an orthogonal matrix ($Q^{-1} = Q^T$). Then we have $$ A = QDQ^T $$ and the right-hand side is symmetric, since $(QDQ^T)^T = QD^TQ^T = QDQ^T$. Therefore $A$ must also be symmetric.
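Both directions of this argument can be illustrated numerically (a sketch with NumPy; the matrices here are random examples, not anything special):

```python
import numpy as np

rng = np.random.default_rng(0)

# Direction 1: build A = Q D Q^T from an orthogonal Q and diagonal D.
# Such an A is automatically symmetric.
M = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(M)               # QR factorization yields an orthogonal Q
D = np.diag(rng.standard_normal(4))
A = Q @ D @ Q.T
print(np.allclose(A, A.T))           # True: A is symmetric

# Direction 2: a generic non-symmetric random matrix has eigenvectors
# that are NOT pairwise orthogonal, so its Gram matrix is not the identity.
B = rng.standard_normal((4, 4))
_, V = np.linalg.eig(B)
gram = V.conj().T @ V                # Gram matrix of the (unit-norm) eigenvectors
print(np.allclose(gram, np.eye(4)))  # False for a generic B
```

So in practice: if you need an orthonormal eigenbasis of a real matrix, checking `np.allclose(A, A.T)` is the test to run first.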