Prove linear operator is a reflection


Prove that a linear operator on $\mathbb{R}^2$ is a reflection if and only if its eigenvalues are $1$ and $-1$ and the eigenvectors with these eigenvalues are orthogonal.

$\Rightarrow$: Let $r: \mathbb{R}^2\ \rightarrow \mathbb{R}^2$ such that $\forall x= \begin{pmatrix} x_1 \\ x_2\end{pmatrix}, r(x)= \begin{pmatrix} \cos \phi & \sin \phi \\ \sin \phi & - \cos \phi \end{pmatrix} \begin{pmatrix} x_1 \\ x_2\end{pmatrix}$

Let $\lambda \in \mathbb{R}$.

$\det\begin{pmatrix} \cos \phi- \lambda & \sin \phi \\ \sin \phi & - \cos \phi- \lambda \end{pmatrix}=0 $

Expanding the determinant gives $(\cos \phi - \lambda)(-\cos \phi - \lambda) - \sin^2 \phi = \lambda^2 - \cos^2 \phi - \sin^2 \phi = \lambda^2 - 1 = 0$, so $\lambda = \pm 1$.

But now I am having some trouble.

Let $\lambda=1$

$\begin{pmatrix} \cos \phi- 1 & \sin \phi \\ \sin \phi & - \cos \phi- 1\end{pmatrix} \begin{pmatrix} x_1 \\ x_2\end{pmatrix}=0$

$\Leftrightarrow \begin{cases} (\cos \phi-1)x_1+\sin \phi x_2=0 \\\sin \phi x_1-(\cos\phi+1)x_2=0\end{cases}$

I don't see which $x= \begin{pmatrix} x_1 \\ x_2\end{pmatrix}$ works.

Another question: once I find an eigenvector for each eigenvalue, which matrix am I supposed to verify is orthogonal? Since each eigenvector is a $2 \times 1$ column, it can't be an orthogonal matrix on its own. So I suppose the matrix to check is the one whose first column is the eigenvector associated with $\lambda=1$ and whose second column is the eigenvector associated with $\lambda=-1$.

I haven't tried the converse yet, so please don't give me a hint on that.

Accepted answer:

If you find non-$0$ vectors $x,y\in\Bbb R^2$ such that $r(x)=x$ and $r(y)=-y,$ then to verify that they are orthogonal, you need only take their dot product.

As for actually finding them, proceed by cases. If $\sin\phi=0,$ then you shouldn't have any trouble, so suppose not. Solving your first equation for $x_2$ gives $$x_2=\frac{1-\cos\phi}{\sin\phi}x_1,$$ whence substitution into your second equation yields $$x_1\sin\phi-\frac{1-\cos^2\phi}{\sin\phi}x_1=0\\x_1\sin\phi-\frac{\sin^2\phi}{\sin\phi}x_1=0\\0=0.$$ Uninformative, at first glance. What that means, though, is that you have a free variable (in fact, you have to, since the eigenspace associated to $1$ is one-dimensional). Thus, you can take any $x_1$ you like, then put $x_2=\frac{1-\cos\phi}{\sin\phi}x_1,$ and you're set!

Addendum: Personally, I'd let $x_1=\sin\phi,$ for simplicity, for then $x_2=1-\cos\phi.$ In fact, $x=\begin{pmatrix}\sin\phi\\1-\cos\phi\end{pmatrix}$ also works in the remaining case $\sin\phi=0,$ $\cos\phi=-1,$ so it is a nice general solution that spares you the case analysis. (The one exception is $\phi$ a multiple of $2\pi$: there this vector is zero, but then the matrix is $\begin{pmatrix}1&0\\0&-1\end{pmatrix}$ and you can take $x=\begin{pmatrix}1\\0\end{pmatrix}$ directly.)
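As a sanity check (an illustration, not part of the proof), here is a short numpy sketch verifying the claims above. The $\lambda=-1$ candidate $y=\begin{pmatrix}1-\cos\phi\\-\sin\phi\end{pmatrix}$ is my own addition, found the same way as $x$; you can check it by the same substitution.

```python
# Numerical sanity check (not part of the proof). The lambda = -1 candidate
# y = (1 - cos phi, -sin phi) is an assumption, found the same way as x.
# Angles that are multiples of 2*pi are skipped, since both candidates
# degenerate to the zero vector there.
import numpy as np

def reflection(phi):
    """Matrix of r in the standard basis."""
    return np.array([[np.cos(phi), np.sin(phi)],
                     [np.sin(phi), -np.cos(phi)]])

for phi in (np.pi / 5, np.pi / 2, np.pi, 2.0):
    r = reflection(phi)
    x = np.array([np.sin(phi), 1 - np.cos(phi)])   # eigenvector for lambda = +1
    y = np.array([1 - np.cos(phi), -np.sin(phi)])  # eigenvector for lambda = -1
    assert np.allclose(r @ x, x)                   # r(x) = x
    assert np.allclose(r @ y, -y)                  # r(y) = -y
    assert np.isclose(x @ y, 0.0)                  # x and y are orthogonal
```

Note that the case $\phi=\pi$ (where $\sin\phi=0$) is included in the loop and passes, which is exactly the point of the addendum.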

Another answer:

This is an exercise in Artin's Algebra. I'm assuming you know what inner products are.

There is no need to get what the eigenvectors look like. We can do it abstractly, which is more convenient. Note that the matrix for $r$ you wrote down is symmetric. Denote it as $A$. Let $u$ and $v$ be the eigenvectors corresponding to the eigenvalues $\lambda_1=1$ and $\lambda_2=-1$: $$ Au=u,\quad Av=-v. $$

Then you have $$ \lambda_1\langle u,v\rangle=\langle \lambda_1 u,v\rangle=\langle Au,v\rangle=\langle u,Av\rangle=\langle u,\lambda_2v\rangle=\lambda_2\langle u,v\rangle $$

Can you see where the symmetry of $A$ is used? What can you conclude from the formula above?
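For what it's worth, the whole chain can also be checked numerically. This numpy sketch (an illustration of the abstract argument, not a substitute for it) confirms that $A$ is symmetric, that $\langle Au,v\rangle=\langle u,Av\rangle$ for arbitrary $u,v$, and that the eigenvectors for $1$ and $-1$ come out orthogonal; the angle is an arbitrary choice.

```python
# Numerical illustration of the abstract argument; phi = 0.7 is arbitrary.
import numpy as np

phi = 0.7
A = np.array([[np.cos(phi), np.sin(phi)],
              [np.sin(phi), -np.cos(phi)]])
assert np.allclose(A, A.T)  # A is symmetric

# Symmetry gives <Au, v> = <u, Av> for any u, v.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(2), rng.standard_normal(2)
assert np.isclose((A @ u) @ v, u @ (A @ v))

# The eigenvalues are 1 and -1, and the eigenvectors are orthogonal.
w, V = np.linalg.eigh(A)  # eigh: symmetric input, eigenvalues in ascending order
assert np.allclose(w, [-1.0, 1.0])
assert np.isclose(V[:, 0] @ V[:, 1], 0.0)
```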