Recently, I read the following proposition in some lecture notes:
Proposition. Let $a,b,c,d \in \mathbb{R}$. Then there exist $\alpha, \beta, r, s \in \mathbb{R}$ such that
$$ \begin{bmatrix}a&b\\c&d\end{bmatrix}= \begin{bmatrix}\cos(\beta)&-\sin(\beta)\\\sin(\beta)&\cos(\beta)\end{bmatrix} \begin{bmatrix}r&0\\0&s\end{bmatrix} \begin{bmatrix}\cos(\alpha)&-\sin(\alpha)\\\sin(\alpha)&\cos(\alpha)\end{bmatrix} $$
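(As a sanity check, not part of the notes: the claimed factorization can be verified numerically. The sketch below, assuming NumPy, recovers angles and scale factors for a concrete matrix via the SVD, absorbing any reflection into the sign of $s$ so that both outer factors are genuine rotations.)

```python
import numpy as np

def R(t):
    """Counterclockwise rotation by angle t."""
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def rotation_scale_rotation(A):
    """Factor a real 2x2 matrix A as R(beta) @ diag(r, s) @ R(alpha).

    Based on the SVD A = U @ diag(sigma) @ Vt; if U or Vt is a
    reflection (determinant -1), we flip one column/row and absorb
    the sign into s, so that both outer factors are rotations.
    """
    U, sigma, Vt = np.linalg.svd(A)
    r, s = sigma
    if np.linalg.det(U) < 0:            # U is a reflection: flip its
        U = U @ np.diag([1.0, -1.0])    # second column, compensate in s
        s = -s
    if np.linalg.det(Vt) < 0:           # same for Vt
        Vt = np.diag([1.0, -1.0]) @ Vt
        s = -s
    beta = np.arctan2(U[1, 0], U[0, 0])     # angle of the rotation U
    alpha = np.arctan2(Vt[1, 0], Vt[0, 0])  # angle of the rotation Vt
    return beta, r, s, alpha

A = np.array([[1.0, 2.0], [3.0, 4.0]])
beta, r, s, alpha = rotation_scale_rotation(A)
print(np.allclose(R(beta) @ np.diag([r, s]) @ R(alpha), A))  # True
```

Note that $r$ and $s$ are allowed to be any reals in the proposition, which is exactly what makes the sign-absorption step legitimate.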
At the outset, I didn't know how to proceed. However, I then came across the following theorem in Sheldon Axler's book "Linear Algebra Done Right":
Theorem. Let $V$ be an $n$-dimensional inner product space. Suppose $T$ is an endomorphism of $V$ with singular values $s_1, \ldots, s_n$. Then there exist orthonormal bases $e_1, \ldots, e_n$ and $f_1,\ldots,f_n$ of $V$ such that $$ Tv=s_1 \langle v, e_1 \rangle f_1 + \ldots + s_n \langle v, e_n \rangle f_n $$ for every $v \in V$.
This led me to the following attempt. Denote by $T$ the endomorphism associated with the matrix on the left-hand side of the proposition. If $s_1$ and $s_2$ are the singular values of $T$, then by the above theorem there exist orthonormal bases $\beta=\left\{{e_1,e_2}\right\}$ and $\gamma=\left\{{f_1,f_2}\right\}$ such that $$ [T]_{\beta}^{\gamma}=\begin{bmatrix}s_1&0\\0&s_2\end{bmatrix} $$ where $[T]_{\beta}^{\gamma}$ is the matrix of $T$ with respect to the bases $\beta$ and $\gamma$; that is, $T(e_1)=s_1f_1$ and $T(e_2)=s_2f_2$. Let $\alpha$ be the canonical basis of $\mathbb{R}^2$, $P$ the change-of-basis matrix from $\beta$ to $\alpha$, and $Q$ the change-of-basis matrix from $\gamma$ to $\alpha$. Then $$ \begin{bmatrix}a&b\\c&d\end{bmatrix}=[T]_{\alpha}^{\alpha}=Q^{-1} [T]_{\beta}^{\gamma}P $$ Now, since $\beta$ and $\gamma$ are orthonormal bases, $P$ and $Q$ are orthogonal, and therefore have the form $$ \begin{bmatrix}\cos(\theta)&(-1)^{k+1}\sin(\theta)\\\sin(\theta)&(-1)^k\cos(\theta)\end{bmatrix} $$ for some integer $k$. The proposition claims that we can always arrange for $k$ to be even. Why? That is not at all clear to me.
Thank you in advance.
For the purposes of this question, even if $P$ "has odd $k$", we can write:
$$\begin{align*} P&=\begin{bmatrix}\cos\theta&\sin\theta\\ \sin\theta&-\cos\theta\end{bmatrix}\\ &=\begin{bmatrix}1&0\\ 0&-1\end{bmatrix}\begin{bmatrix}\cos\theta&\sin\theta\\ -\sin\theta&\cos\theta\end{bmatrix}\\ &=\begin{bmatrix}1&0\\ 0&-1\end{bmatrix}\begin{bmatrix}\cos(-\theta)&-\sin(-\theta)\\ \sin(-\theta)&\cos(-\theta)\end{bmatrix}\\ \begin{bmatrix}s_1&0\\0&s_2\end{bmatrix} P &=\begin{bmatrix}s_1&0\\0&-s_2\end{bmatrix} \begin{bmatrix}\cos(-\theta)&-\sin(-\theta)\\ \sin(-\theta)&\cos(-\theta)\end{bmatrix}\\ \end{align*}$$
So by replacing $s_2$ with $-s_2$ and $\theta$ with $-\theta$, the orthogonal matrix replacing $P$ "has even $k$": it is a rotation matrix.
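A minimal numerical check of this sign trick, assuming NumPy (the values of $s_1$, $s_2$, $\theta$ below are arbitrary test values):

```python
import numpy as np

s1, s2, theta = 2.0, 3.0, 0.7           # arbitrary test values

def R(t):                               # counterclockwise rotation by t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

P = np.array([[np.cos(theta),  np.sin(theta)],
              [np.sin(theta), -np.cos(theta)]])   # reflection ("odd k")

lhs = np.diag([s1, s2]) @ P             # original: scaling times reflection
rhs = np.diag([s1, -s2]) @ R(-theta)    # sign absorbed into s2, rotation
print(np.allclose(lhs, rhs))            # True
```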
The case of $Q$ is similar:
$$\begin{align*} Q&=\begin{bmatrix}1&0\\0&-1\end{bmatrix} \begin{bmatrix}\cos(-\theta)&-\sin(-\theta)\\ \sin(-\theta)&\cos(-\theta)\end{bmatrix}\\ Q^{-1}&= \begin{bmatrix}\cos(-\theta)&-\sin(-\theta)\\ \sin(-\theta)&\cos(-\theta)\end{bmatrix}^{-1} \begin{bmatrix}1&0\\0&-1\end{bmatrix}\\ Q^{-1}\begin{bmatrix}s_1&0\\0&s_2\end{bmatrix} &= \begin{bmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{bmatrix} \begin{bmatrix}s_1&0\\0&-s_2\end{bmatrix}\\ \end{align*}$$
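And the same kind of numerical check for the $Q$ identity (again assuming NumPy, with arbitrary test values):

```python
import numpy as np

s1, s2, theta = 2.0, 3.0, 0.7           # arbitrary test values

def R(t):                               # counterclockwise rotation by t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

Q = np.diag([1.0, -1.0]) @ R(-theta)    # reflection ("odd k")

lhs = np.linalg.inv(Q) @ np.diag([s1, s2])
rhs = R(theta) @ np.diag([s1, -s2])     # rotation, sign absorbed into s2
print(np.allclose(lhs, rhs))            # True
```

Combining both adjustments, $Q^{-1}\operatorname{diag}(s_1,s_2)P$ becomes a rotation, a (possibly sign-changed) diagonal matrix, and another rotation, which is exactly the form the proposition asks for.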