Orthonormal Eigenbasis of the reflection matrix


This question is more about computation than theory: I feel pretty confident in the latter but can't get the former to work.

What I seek is

Given $\begin{pmatrix} \cos(\alpha) & \sin(\alpha) \\ \sin(\alpha) & -\cos(\alpha) \end{pmatrix}$ as the matrix of the linear operator $\mathbb{A}$ in an arbitrary basis $e_1,e_2$, find the matrix of $\mathbb{A}$ in an orthonormal eigenbasis $f_1,f_2$. (For full context, I am working in a Euclidean vector space.)

Now at first this didn't seem too bad. I solved the characteristic polynomial, getting eigenvalues $\lambda = \pm 1$; then $\left(\begin{pmatrix} \cos(\alpha) & \sin(\alpha) \\ \sin(\alpha) & -\cos(\alpha) \end{pmatrix} - \lambda I\right)\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = 0$ gives $(\cot(\alpha) - \csc(\alpha), 1)$ and $(\cot(\alpha) + \csc(\alpha),1)$ as eigenvectors, which by inspection are orthogonal.
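As a quick sanity check on this hand computation, the eigenvalues, eigenvector relations, and orthogonality can be verified numerically (a sketch, assuming an arbitrary sample angle $\alpha$ that is not a multiple of $\pi$, so the cotangent and cosecant are defined):

```python
import numpy as np

alpha = 0.7  # arbitrary sample angle, not a multiple of pi (assumption)
A = np.array([[np.cos(alpha),  np.sin(alpha)],
              [np.sin(alpha), -np.cos(alpha)]])

# The reflection matrix should have eigenvalues -1 and +1.
print(np.sort(np.linalg.eigvalsh(A)))  # approximately [-1.  1.]

# The eigenvectors found by hand: (cot - csc, 1) and (cot + csc, 1).
cot, csc = 1 / np.tan(alpha), 1 / np.sin(alpha)
v_minus = np.array([cot - csc, 1.0])  # eigenvalue -1
v_plus  = np.array([cot + csc, 1.0])  # eigenvalue +1

print(np.allclose(A @ v_minus, -v_minus))  # True
print(np.allclose(A @ v_plus,   v_plus))   # True
print(np.isclose(v_minus @ v_plus, 0.0))   # True: orthogonal
```

The orthogonality also follows symbolically from $(\cot\alpha-\csc\alpha)(\cot\alpha+\csc\alpha) + 1 = \cot^2\alpha - \csc^2\alpha + 1 = 0$.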

Here the trouble begins: we normalize by way of the cumbersome factors $c_1=\sqrt{\lvert{\cot(\alpha) - \csc(\alpha)}\rvert^2 + 1}, c_2=\sqrt{\lvert{\cot(\alpha) + \csc(\alpha)}\rvert^2 + 1}$. These factors make the computations outrageously complicated, because the change of basis matrices $$C = \begin{pmatrix} \frac{1}{c_1}(\cot(\alpha) - \csc(\alpha)) & \frac{1}{c_2}(\cot(\alpha) + \csc(\alpha)) \\ \frac{1}{c_1} & \frac{1}{c_2} \end{pmatrix} $$

$$C^{-1} = \begin{pmatrix} \frac{-c_1\sin(\alpha)}{2} & c_1 \cos^2(\frac{\alpha}{2}) \\ \frac{c_2\sin(\alpha)}{2} & c_2 \sin^2(\frac{\alpha}{2})\end{pmatrix}$$

which I would use to find $A'=C^{-1}AC$ create an absolutely terrible mess. Worst of all, even after slogging through them, the result does not seem to be a diagonal matrix. This is my fundamental issue, as I know from the theory that our matrix should be diagonal in an orthonormal eigenbasis.

Thus my fundamental question is: is there a better way to approach these computations? I know from checking that using $C,C^{-1}$ without the normalizing factors $c_1,c_2$ is relatively straightforward and yields a nice diagonal matrix of the form $\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}$ — so much nicer without normalizing! Everything would be simpler if I just avoided normalizing, or only normalized the final matrix $A'$, but I can't really justify doing this considering my task is to find $A'$ in an orthonormal eigenbasis.
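(For what it's worth, the claim that normalizing should not spoil diagonality can be checked numerically — a sketch with an arbitrary sample angle; if the normalized conjugation does not come out diagonal on paper, the hand computation has an arithmetic slip somewhere:)

```python
import numpy as np

alpha = 0.7  # arbitrary sample angle, not a multiple of pi (assumption)
A = np.array([[np.cos(alpha),  np.sin(alpha)],
              [np.sin(alpha), -np.cos(alpha)]])

cot, csc = 1 / np.tan(alpha), 1 / np.sin(alpha)
v1 = np.array([cot - csc, 1.0])  # eigenvalue -1
v2 = np.array([cot + csc, 1.0])  # eigenvalue +1

# Change-of-basis matrix with NORMALIZED eigenvector columns.
C = np.column_stack([v1 / np.linalg.norm(v1), v2 / np.linalg.norm(v2)])

A_prime = np.linalg.inv(C) @ A @ C
print(np.round(A_prime, 10))  # diag(-1, 1): normalization preserves diagonality
```

Since the normalized eigenvectors are orthonormal, $C$ is orthogonal here, so $C^{-1}=C^T$ as well.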

Thanks very much for any assistance!


Accepted answer

Write $A: \mathbb{R}^2 \to \mathbb{R}^2$ out as the matrix in the standard basis: $$ A = \begin{pmatrix} \cos \alpha & \sin \alpha \\ \sin \alpha & -\cos \alpha \end{pmatrix} $$

You've already found two eigenvectors $v_1, v_2$ with eigenvalues $1$ and $-1$ respectively (I multiplied through by $\sin \alpha$ since I liked the result more):

$$ v_1 = \begin{pmatrix} \cos \alpha + 1 \\ \sin \alpha \end{pmatrix}, \quad v_2 = \begin{pmatrix} \cos \alpha - 1 \\ \sin \alpha \end{pmatrix}$$

Since we have $Av_1 = 1 v_1 + 0 v_2$ and $A v_{2} = 0 v_1 + (-1) v_2$, the columns of the linear operator $A$ in the $v_1, v_2$ basis must be $(1, 0)$ and $(0, -1)$ respectively. So:

$$ [A]_{\{v_1, v_2\}} = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} $$
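These two eigenvector relations are easy to verify numerically (a sketch with an arbitrary sample angle):

```python
import numpy as np

alpha = 1.1  # arbitrary sample angle (assumption)
A = np.array([[np.cos(alpha),  np.sin(alpha)],
              [np.sin(alpha), -np.cos(alpha)]])

# The sin(alpha)-scaled eigenvectors, defined for every alpha.
v1 = np.array([np.cos(alpha) + 1, np.sin(alpha)])  # eigenvalue +1
v2 = np.array([np.cos(alpha) - 1, np.sin(alpha)])  # eigenvalue -1

print(np.allclose(A @ v1,  v1))  # True
print(np.allclose(A @ v2, -v2))  # True
```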

Let's say you normalise these eigenvectors to obtain a new basis $w_1, w_2$, where $w_1 = c_1 v_1$ and $w_2 = c_2 v_2$. Then in the $w_1, w_2$ basis, the action of $A$ looks exactly the same: $Aw_1 = w_1 + 0 w_2$ and $Aw_2 = 0 w_1 - w_2$ (check that this does actually work, from what you know about $Av_1$ and $Av_2$). And so

$$ [A]_{\{w_1, w_2\}} = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} $$

So normalising genuinely does not affect the resulting matrix.

While it is true that you could write down the change of basis matrix from $w_1, w_2$ to the standard basis $e_1, e_2$, conjugate the original matrix $A$ by it, and get the same answer, in practice this is wildly impractical — and unnecessary once you have an eigenbasis, since you already know the resulting matrix must be diagonal.

Second answer

There’s a bit of a problem with the eigenvectors that you’ve computed: they’re undefined for $\alpha=n\pi$, but the matrix is perfectly good for those values of $\alpha$. Since any nonzero multiple of an eigenvector is also an eigenvector with the same eigenvalue, multiply both of them by $\sin\alpha$ to eliminate the troublesome denominator. You now have the eigenvectors $(\cos\alpha\pm1,\sin\alpha)^T$, which are defined for all values of $\alpha$. Happily, doing this makes normalization much simpler, too.

We now have $$(1+\cos\alpha)^2+\sin^2\alpha = 1+2\cos\alpha+\cos^2\alpha+1+\sin^2\alpha = 2(1+\cos\alpha)$$ and $${1+\cos\alpha \over \sqrt{2(1+\cos\alpha)}} = \sqrt{{1+\cos\alpha\over2}} = \pm\cos{\frac\alpha2}, \\ {\sin\alpha \over \sqrt{2(1+\cos\alpha)}} = \sqrt{{1-\cos\alpha \over 2}} = \pm\sin{\frac\alpha2}.$$ Note that I don’t have the absolute value as you do in your norms. It’s unnecessary since $|x|^2=x^2$. The correct combination of signs can be determined by examining $\alpha=\pi/2$. For the other unit eigenvector, you could go through a similar computation, but the matrix is symmetric, so you know that the eigenspaces of different eigenvalues are orthogonal, therefore the other unit eigenvector is of the form $(\pm\sin(\alpha/2),\pm\cos(\alpha/2))^T$, with signs chosen appropriately.
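A numeric check of these half-angle unit eigenvectors (a sketch, taking a sample $\alpha\in(0,\pi)$ so that the $+$ signs are the correct choice):

```python
import numpy as np

alpha = 0.9  # sample angle in (0, pi), where the + signs are correct (assumption)
A = np.array([[np.cos(alpha),  np.sin(alpha)],
              [np.sin(alpha), -np.cos(alpha)]])

u1 = np.array([np.cos(alpha/2), np.sin(alpha/2)])   # unit eigenvector, eigenvalue +1
u2 = np.array([-np.sin(alpha/2), np.cos(alpha/2)])  # unit eigenvector, eigenvalue -1

print(np.isclose(np.linalg.norm(u1), 1.0))  # True: already normalized
print(np.allclose(A @ u1,  u1))             # True
print(np.allclose(A @ u2, -u2))             # True
```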

All symmetric real matrices are orthogonally diagonalizable, so you could have proceeded directly to the diagonalization: $$C^TAC = \begin{bmatrix}\cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix}\cos\alpha&\sin\alpha\\\sin\alpha&-\cos\alpha\end{bmatrix} \begin{bmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} = \begin{bmatrix} \cos(\alpha-2\theta) & \sin(\alpha-2\theta) \\ \sin(\alpha-2\theta) & -\cos(\alpha-2\theta) \end{bmatrix}.$$ You could instead start with a reflection as the eigenbasis matrix $C$. It represents the same eigenspaces and since it’s its own inverse you don’t have to remember on which side of $A$ the inverse goes, but a rotation is more conventional. The off-diagonal terms vanish when $\theta=\alpha/2$, for which $C^TAC=\operatorname{diag}(1,-1)$.
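Both the conjugation identity and the diagonalization at $\theta=\alpha/2$ can be confirmed numerically (a sketch with arbitrary sample values of $\alpha$ and $\theta$):

```python
import numpy as np

alpha, theta = 1.3, 0.4  # arbitrary sample angles (assumption)

def reflection(a):
    # 2x2 reflection across the line at angle a/2
    return np.array([[np.cos(a),  np.sin(a)],
                     [np.sin(a), -np.cos(a)]])

def rotation(t):
    # 2x2 counterclockwise rotation by angle t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

A, C = reflection(alpha), rotation(theta)

# Conjugating a reflection by a rotation shifts its angle: C^T A C = Ref(alpha - 2*theta).
print(np.allclose(C.T @ A @ C, reflection(alpha - 2*theta)))  # True

# Choosing theta = alpha/2 makes the off-diagonal terms vanish.
C_half = rotation(alpha / 2)
print(np.round(C_half.T @ A @ C_half, 10))  # diag(1, -1)
```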