This question is in a way more about computation than theory: I feel fairly confident in the theory, but I can't get the computation to work.
What I seek is
given $\begin{pmatrix} \cos(\alpha) & \sin(\alpha) \\ \sin(\alpha) & -\cos(\alpha) \end{pmatrix}$ as the matrix of the linear operator $\mathbb{A}$ in an arbitrary basis $e_1,e_2$, find the matrix of $\mathbb{A}$ in an orthonormal eigenbasis $f_1,f_2$. (For full context, I am working in a Euclidean vector space.)
Now at first this didn't seem too bad. I solved the characteristic polynomial, getting eigenvalues $\lambda = \pm 1$; then solving $\left(\begin{pmatrix} \cos(\alpha) & \sin(\alpha) \\ \sin(\alpha) & -\cos(\alpha) \end{pmatrix} - \lambda I\right)\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = 0$ gives $(\cot(\alpha) - \csc(\alpha), 1)$ and $(\cot(\alpha) + \csc(\alpha), 1)$ as eigenvectors, which by inspection are orthogonal.
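(As a sanity check on this step, the characteristic polynomial and eigenvalues can be verified symbolically, e.g. with sympy; the variable names below are my own choices.)

```python
import sympy as sp

a, lam = sp.symbols('alpha lambda', real=True)
A = sp.Matrix([[sp.cos(a),  sp.sin(a)],
               [sp.sin(a), -sp.cos(a)]])

# Characteristic polynomial det(A - lambda*I) collapses to lambda^2 - 1
p = sp.simplify((A - lam * sp.eye(2)).det())
print(p)  # lambda**2 - 1

# So the eigenvalues are +1 and -1
print(sorted(A.eigenvals().keys()))  # [-1, 1]
```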
Here the trouble begins: we normalize by way of the cumbersome factors $c_1=\sqrt{(\cot(\alpha) - \csc(\alpha))^2 + 1}$, $c_2=\sqrt{(\cot(\alpha) + \csc(\alpha))^2 + 1}$. These factors make the computations outrageously complicated, because the change of basis matrices $$C = \begin{pmatrix} \frac{1}{c_1}(\cot(\alpha) - \csc(\alpha)) & \frac{1}{c_2}(\cot(\alpha) + \csc(\alpha)) \\ \frac{1}{c_1} & \frac{1}{c_2} \end{pmatrix} $$
$$C^{-1} = \begin{pmatrix} \frac{-c_1\sin(\alpha)}{2} & c_1 \cos^2(\frac{\alpha}{2}) \\ \frac{c_2\sin(\alpha)}{2} & c_2 \sin^2(\frac{\alpha}{2})\end{pmatrix}$$
which I would use to find $A'=C^{-1}AC$, create an absolutely terrible mess. Worst of all, even after slogging through the computation, the result does not seem to be a diagonal matrix. This is my fundamental issue, since I know from the theory that the matrix should be diagonal in an orthonormal eigenbasis.
Thus my fundamental question is: is there a better way to approach these computations? I know from checking that using $C, C^{-1}$ as above but without the factors $c_1, c_2$ is relatively straightforward and yields the nice diagonal matrix $\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}$; so much nicer without normalizing! Everything would be simpler if I just avoided normalizing, or only normalized the final matrix $A'$, but I can't really justify doing this, since my task is to find $A'$ in an orthonormal eigenbasis.
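(For what it's worth, the conjugation with the normalized eigenvectors can be checked numerically; a minimal sketch with numpy at an arbitrary test angle of my choosing:)

```python
import numpy as np

alpha = 0.7  # arbitrary test angle (my choice)
A = np.array([[np.cos(alpha),  np.sin(alpha)],
              [np.sin(alpha), -np.cos(alpha)]])

# Eigenvectors (cot a - csc a, 1) and (cot a + csc a, 1), normalised
v1 = np.array([np.cos(alpha)/np.sin(alpha) - 1/np.sin(alpha), 1.0])
v2 = np.array([np.cos(alpha)/np.sin(alpha) + 1/np.sin(alpha), 1.0])
C = np.column_stack([v1/np.linalg.norm(v1), v2/np.linalg.norm(v2)])

# C has orthonormal columns, so C^{-1} = C^T; the conjugate is diagonal
Ap = C.T @ A @ C
print(np.round(Ap, 10))  # ~ diag(-1, 1)
```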
Thanks very much for any assistance!
Write $A: \mathbb{R}^2 \to \mathbb{R}^2$ out as the matrix in the standard basis: $$ A = \begin{pmatrix} \cos \alpha & \sin \alpha \\ \sin \alpha & -\cos \alpha \end{pmatrix} $$
You've already found two eigenvectors $v_1, v_2$ with eigenvalues $1$ and $-1$ respectively (I multiplied through by $\sin \alpha$ since I liked the result more):
$$ v_1 = \begin{pmatrix} \cos \alpha + 1 \\ \sin \alpha \end{pmatrix}, \quad v_2 = \begin{pmatrix} \cos \alpha - 1 \\ \sin \alpha \end{pmatrix}$$
Since we have $Av_1 = 1 v_1 + 0 v_2$ and $A v_{2} = 0 v_1 + (-1) v_2$, the columns of the linear operator $A$ in the $v_1, v_2$ basis must be $(1, 0)$ and $(0, -1)$ respectively. So:
$$ [A]_{\{v_1, v_2\}} = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} $$
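One can confirm this symbolically by conjugating $A$ with the (unnormalised) eigenvector matrix; a short sympy check, with variable names of my own:

```python
import sympy as sp

a = sp.symbols('alpha', positive=True)
A = sp.Matrix([[sp.cos(a),  sp.sin(a)],
               [sp.sin(a), -sp.cos(a)]])

# Columns are v1 (eigenvalue +1) and v2 (eigenvalue -1)
P = sp.Matrix([[sp.cos(a) + 1, sp.cos(a) - 1],
               [sp.sin(a),     sp.sin(a)]])

Ap = (P.inv() * A * P).applyfunc(sp.simplify)
print(Ap)  # Matrix([[1, 0], [0, -1]])
```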
Let's say you normalise these eigenvectors to obtain a new basis $w_1, w_2$, where $w_1 = c_1 v_1$ and $w_2 = c_2 v_2$. Then in the $w_1, w_2$ basis the action of $A$ looks exactly the same: $Aw_1 = w_1 + 0 w_2$ and $Aw_2 = 0 w_1 - w_2$ (check that this actually works, using what you know about $Av_1$ and $Av_2$). And so
$$ [A]_{\{w_1, w_2\}} = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} $$
So renormalising genuinely does not affect the resulting matrix.
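A quick numerical illustration of this scaling invariance (the angle and scale factors are arbitrary choices of mine):

```python
import numpy as np

alpha = 1.2  # arbitrary test angle
A = np.array([[np.cos(alpha),  np.sin(alpha)],
              [np.sin(alpha), -np.cos(alpha)]])
v1 = np.array([np.cos(alpha) + 1, np.sin(alpha)])  # eigenvalue +1
v2 = np.array([np.cos(alpha) - 1, np.sin(alpha)])  # eigenvalue -1

# Unnormalised, normalised, or arbitrarily scaled: same diagonal matrix
for c1, c2 in [(1.0, 1.0),
               (1/np.linalg.norm(v1), 1/np.linalg.norm(v2)),
               (3.7, -0.2)]:
    P = np.column_stack([c1 * v1, c2 * v2])
    Ap = np.linalg.inv(P) @ A @ P
    assert np.allclose(Ap, np.diag([1.0, -1.0]))
```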
While it is true that you could write down the change of basis matrix from the $w_1, w_2$ basis to the standard basis $e_1, e_2$, conjugate the original matrix $A$ by it, and get the same answer, in practice this is wildly impractical, especially since, once you have an eigenbasis, you already know the resulting matrix must be diagonal.