In the appendix of a book on linear and nonlinear optimization, I saw the following statement on symmetric matrices:
In these notes, $A$ is considered an $n \times n$ square matrix and $E^n$ is the $n$-dimensional Euclidean space. I already know that a repeated eigenvalue of a symmetric matrix corresponds to linearly independent eigenvectors, but those eigenvectors are not necessarily orthogonal. How can the third statement be correct, then, if $A$ has repeated eigenvalues? I think this is a misstatement, but I wonder if there is something I am missing.

I understood what I was missing, so I am writing a self-answer. Let $\lambda$ be an eigenvalue of a symmetric matrix $A$ with multiplicity $m$. We have $m$ linearly independent eigenvectors $x_1, \dots, x_m$ corresponding to this eigenvalue. Using Gram-Schmidt orthogonalization, for example, we can produce orthogonal vectors $v_1, \dots, v_m$ in the subspace spanned by $x_1, \dots, x_m$; each $v_i$ is a linear combination of $x_1, \dots, x_m$. The key point is that $v_1, \dots, v_m$ are still eigenvectors of $A$. Pick any $v_i$ among them: we have $v_i = \sum_{j=1}^{m}a_jx_j$ with not all $a_j=0$, so $Av_i = A\left(\sum_{j=1}^{m}a_jx_j\right) = \sum_{j=1}^{m}a_jAx_j = \sum_{j=1}^{m}a_j\lambda x_j = \lambda\sum_{j=1}^{m}a_jx_j = \lambda v_i$. Since eigenvectors of a symmetric matrix belonging to distinct eigenvalues are automatically orthogonal, applying this construction within each eigenspace yields $n$ mutually orthogonal eigenvectors.
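A small numerical sketch of the argument above (the matrix, eigenvectors, and eigenvalue here are my own illustrative choices, not from the book): take a symmetric $A$ with eigenvalue $2$ of multiplicity $2$, start from two linearly independent but non-orthogonal eigenvectors, orthogonalize with one Gram-Schmidt step, and check that the result is still an eigenvector for the same eigenvalue.

```python
import numpy as np

# Symmetric matrix with eigenvalue 2 of multiplicity 2 (and a simple eigenvalue 5).
A = np.diag([2.0, 2.0, 5.0])
lam = 2.0

# Two linearly independent but non-orthogonal eigenvectors for lambda = 2.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([1.0, 1.0, 0.0])
assert np.allclose(A @ x1, lam * x1) and np.allclose(A @ x2, lam * x2)
assert not np.isclose(x1 @ x2, 0.0)  # they are not orthogonal

# Gram-Schmidt step: subtract from x2 its projection onto v1.
v1 = x1
v2 = x2 - (x2 @ v1) / (v1 @ v1) * v1

assert np.isclose(v1 @ v2, 0.0)       # now orthogonal
assert np.allclose(A @ v2, lam * v2)  # still an eigenvector for lambda = 2,
                                      # since v2 is a linear combination of x1, x2
print("v2 =", v2)
```

The second pair of assertions is exactly the computation $Av_i = \lambda v_i$ from the answer: orthogonalizing inside one eigenspace never leaves that eigenspace.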