Definition: An $n \times n$ matrix $A$ is *orthogonally congruent* to a matrix $B$ if there exists an orthogonal matrix $C$ such that $$A = C^{-1}BC.$$
Theorem: If $A$ is symmetric, then $A$ is orthogonally congruent to a diagonal matrix $B = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where $\lambda_1, \dots, \lambda_n$ are the (necessarily real) eigenvalues of $A$ (with multiple eigenvalues repeated according to their multiplicities).
Why is this true? If $A$ has $n$ distinct eigenvalues, it's easy to see, for we can take $$C = \operatorname{col}(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n),$$ the matrix whose columns are the eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n$ associated with those eigenvalues, and then $$AC = CB, \qquad B = \operatorname{diag}(\lambda_1, \dots, \lambda_n), \qquad A = CBC^{-1}.$$ But if we do the same thing in the case where an eigenvalue is repeated, wouldn't $C$ have multiple linearly dependent columns? (If we use the repeated eigenvalue to solve for eigenvectors, wouldn't we get the same vector each time?) If so, $\det(C) = 0$, so how is it still possible that $$A = CBC^{-1}$$ when $C^{-1}$ doesn't exist?
There are a few points you're missing here:

- Because $A$ is symmetric, an eigenvalue $\lambda$ of algebraic multiplicity $k$ always has a $k$-dimensional eigenspace: solving $(A - \lambda I)\mathbf{v} = \mathbf{0}$ yields $k$ linearly independent solutions, not the same vector repeated $k$ times. For example, $A = I_2$ has eigenvalue $1$ with multiplicity $2$, and its eigenspace is all of $\mathbb{R}^2$.
- So you choose a *basis* of each eigenspace rather than reusing one eigenvector. Stacking all of these basis vectors as the columns of $C$ gives $n$ linearly independent columns, hence $\det(C) \ne 0$.
- Moreover, eigenvectors of a symmetric matrix belonging to distinct eigenvalues are automatically orthogonal, and within each eigenspace you can orthonormalize your chosen basis (Gram–Schmidt). Then $C$ is orthogonal, so $C^{-1} = C^T$ certainly exists.
Hopefully that clears some things up. This, of course, is not a proof, but presumably this is proved in your textbook.
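As a quick numerical sanity check (my own sketch, not part of the answer above; the $3 \times 3$ matrix is a made-up example with eigenvalues $4, 1, 1$), NumPy's `eigh` routine for symmetric matrices shows that a repeated eigenvalue still comes with a full set of orthonormal eigenvectors:

```python
import numpy as np

# Hypothetical example matrix, chosen because it is symmetric with
# eigenvalues 4, 1, 1 -- the eigenvalue 1 is repeated.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# For a symmetric matrix, eigh returns real eigenvalues (in ascending
# order) and an orthonormal set of eigenvectors as the columns of C.
# The eigenspace of the repeated eigenvalue 1 is 2-dimensional, so two
# linearly independent eigenvectors exist for it.
eigenvalues, C = np.linalg.eigh(A)
B = np.diag(eigenvalues)

print(np.round(eigenvalues, 6))         # [1. 1. 4.]
print(np.allclose(C.T @ C, np.eye(3)))  # True: C is orthogonal, C^{-1} = C^T
print(np.allclose(A, C @ B @ C.T))      # True: A = C B C^{-1}
```

Here $\det(C) = \pm 1$, so $C$ is invertible: the repeated eigenvalue did not collapse any columns of $C$.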