I did the part of the proof where the eigenvectors correspond to distinct eigenvalues: at the conclusion we get $(\lambda_i - \lambda_j)\,x_i'x_j = 0$, from which it follows that $x_i'x_j = 0$ whenever $\lambda_i \neq \lambda_j$.
I want to know how we can prove that this statement is true for repeated eigenvalues too.
Let's take a mundane example: take the trivially symmetric real matrix $$ A= \begin{bmatrix} 1&0\\ 0&1 \end{bmatrix}.$$ All of $\mathbb{R}^2$ consists of eigenvectors for $A$. Of course, for instance $$v_1= \begin{bmatrix} 1\\ 0 \end{bmatrix},\:\: v_2=\begin{bmatrix} 1\\ 1 \end{bmatrix}$$ are not orthogonal, in spite of belonging to the same eigenspace corresponding to $\lambda=1$. Moreover, $v_1$ and $v_2$ are linearly independent, but still not orthogonal.
However, by applying Gram-Schmidt within each eigenspace, we can obtain an orthonormal basis consisting of eigenvectors: the orthogonalized vectors stay inside the eigenspace (it is a subspace), and eigenvectors belonging to distinct eigenvalues are orthogonal automatically, by the argument you already have. So the statement should be read as saying that eigenvectors *can be chosen* orthogonal, not that every choice of eigenvectors is orthogonal.
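For instance, applying Gram-Schmidt to the $v_1, v_2$ above:
$$u_1 = v_1 = \begin{bmatrix} 1\\ 0 \end{bmatrix},\qquad u_2 = v_2 - \frac{u_1'v_2}{u_1'u_1}\,u_1 = \begin{bmatrix} 1\\ 1 \end{bmatrix} - \begin{bmatrix} 1\\ 0 \end{bmatrix} = \begin{bmatrix} 0\\ 1 \end{bmatrix},$$
and $\{u_1, u_2\}$ is an orthonormal basis of the eigenspace for $\lambda = 1$ (both vectors here happen to have unit length already, so no normalization step is needed).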