It is well known that if an $n \times n$ matrix $A$ has $n$ distinct eigenvalues, then its eigenvectors form a basis.
The same result holds if $A$ is symmetric.
Consider
$ A =\left[ {\begin{array}{ccc} 1 & 2 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \\ \end{array}}\right] $ .
This matrix has the single eigenvalue $\lambda=1$ (with algebraic multiplicity $3$) and is not symmetric.
But the vectors $ v_1 =\left[ {\begin{array}{c} 1 \\ 0 \\ 0 \\ \end{array}}\right] $ , $ v_2 =\left[ {\begin{array}{c} 0 \\ 1/2 \\ 0 \\ \end{array}}\right] $ , $ v_3 =\left[ {\begin{array}{c} 0 \\ -3/8 \\ 1/4 \\ \end{array}}\right] $ do form a basis. (Note that only $v_1$ is an actual eigenvector here: the eigenspace of $\lambda=1$ is one-dimensional, while $v_2$ and $v_3$ satisfy $(A-I)v_2=v_1$ and $(A-I)v_3=v_2$, so they are generalized eigenvectors.)
What sufficient conditions guarantee that the eigenvectors of a matrix form a basis?
A square matrix is diagonalizable if and only if there exists a basis of eigenvectors. That is, $A$ is diagonalizable if and only if there exists an invertible matrix $P$ such that $P^{-1}AP=D$ for some diagonal matrix $D$; the columns of $P$ are then eigenvectors of $A$.
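This definition can be checked computationally. A minimal sketch using sympy (the symmetric example matrix here is hypothetical, chosen only for illustration; it is not the matrix $A$ from the question):

```python
from sympy import Matrix

# Hypothetical symmetric matrix; by the spectral theorem it must
# be diagonalizable. Not the matrix A from the question.
S = Matrix([[2, 1],
            [1, 2]])

# diagonalize() returns (P, D) with P^{-1} S P = D;
# it raises an error if the matrix is not diagonalizable.
P, D = S.diagonalize()

assert P.inv() * S * P == D  # columns of P are eigenvectors of S
```

For the upper triangular matrix $A$ of the question, `A.diagonalize()` would instead raise an error, since no basis of eigenvectors exists.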
One can show that a matrix is diagonalizable precisely when the dimension of each eigenspace (the geometric multiplicity) equals the algebraic multiplicity of the corresponding eigenvalue as a root of the characteristic polynomial.
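For the matrix $A$ above this criterion fails. A sketch using sympy (an assumed tool choice) that compares the two multiplicities:

```python
from sympy import Matrix

# The matrix A from the question.
A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])

# eigenvects() returns triples (eigenvalue, algebraic multiplicity,
# basis of the corresponding eigenspace).
for val, alg_mult, basis in A.eigenvects():
    geo_mult = len(basis)  # geometric multiplicity
    print(val, alg_mult, geo_mult)  # prints: 1 3 1
```

The sole eigenvalue $\lambda=1$ has algebraic multiplicity $3$ but a one-dimensional eigenspace, so $A$ is not diagonalizable.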
If the dimension of an eigenspace is smaller than the algebraic multiplicity, the matrix is called defective: the eigenvectors no longer form a basis, since they do not span the whole space. One can still extend the set of eigenvectors to a basis using so-called generalized eigenvectors. Rewriting the matrix with respect to this basis yields an upper triangular matrix whose only non-zero entries lie on the diagonal and the superdiagonal. This is the Jordan normal form, which captures exactly how the eigenvectors fail to form a basis.
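The Jordan normal form of the matrix $A$ from the question can be computed directly. A sketch with sympy (an assumed tool choice):

```python
from sympy import Matrix

# The matrix A from the question.
A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])

# jordan_form() returns (P, J) with P^{-1} A P = J; the columns
# of P form a basis of generalized eigenvectors.
P, J = A.jordan_form()
print(J)  # a single 3x3 Jordan block with eigenvalue 1
```

Here $J$ has $1$ on the diagonal and $1$ on the superdiagonal: a single Jordan block of size $3$, reflecting that the eigenspace contributes only one genuine eigenvector.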