Find eigenvalues and eigenvectors of the operator $A$


The question is: Find the eigenvalues and eigenvectors of the operator $A$ on $\Bbb{R}^3$ given by $A\mathbf{x}=|\mathbf{a}|^2 \mathbf{x}- (\mathbf{a} \cdot \mathbf{x}) \mathbf{a}$, where $\mathbf{a}$ is a given constant vector. How do you know without any calculations that $A$ must have an orthonormal eigenbasis?

I have seen examples similar to this question. I'm wondering whether there is a systematic way to solve this kind of question. Someone showed me that you first get $\mathbf{x}=\lambda\mathbf{a}$. What is the reasoning behind that, and how does it help solve the problem?


Accepted answer:

Notice that $A$ is a symmetric matrix: $A_{ij}=(Ae_{j}\cdot e_{i})=|a|^{2}(e_{j}\cdot e_{i})-(a\cdot e_{j})(a\cdot e_{i})$, and this expression is unchanged when $i$ and $j$ are swapped, so $A_{ij}=(Ae_{i}\cdot e_{j})=A_{ji}$. By the spectral theorem, $A$ is therefore diagonalizable and has an orthonormal eigenbasis.

Intuition behind the eigenvectors: Notice that $A'x:=(a\cdot x)a$ maps every vector onto the line spanned by $a$ (it is a true projection only when $|a|=1$). Rescale and define $A''x:=\frac{1}{|a|^{2}}(a\cdot x)a$. Then $A''$ is the orthogonal projection onto the line spanned by $a$, and $x-\frac{1}{|a|^{2}}(a\cdot x)a$ is the component of $x$ orthogonal to $a$.

(Geometrically) It is clear that $a$ is an eigenvector of $A''$ corresponding to the eigenvalue $1$. The other eigenvalue of $A''$ is $0$, and its eigenvectors are the vectors orthogonal to $a$; pick two orthonormal ones and call them $a_{1}^{\bot}$ and $a_{2}^{\bot}$.

Now if we look at $I-A''$, it is the projection onto the plane spanned by $a_{1}^{\bot},a_{2}^{\bot}$. In fact, $(I-A'')a_{i}^{\bot}=a_{i}^{\bot}$ for $i=1,2$ and $(I-A'')a=0$, i.e., the $a_{i}^{\bot}$ are eigenvectors of $I-A''$ corresponding to the eigenvalue $1$, and $a$ is an eigenvector of $I-A''$ corresponding to the eigenvalue $0$.

In our case $A=|a|^{2}(I-A'')$. Using the above intuition, the eigenvalues of $A$ are $|a|^{2}$ (with eigenvectors $a_{1}^{\bot}$ and $a_{2}^{\bot}$) and $0$ (with eigenvector $a$).
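This projection decomposition is easy to check numerically. A minimal sketch in Python with NumPy (the vector `a` below is an arbitrary example, not part of the question):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])       # example vector with |a|^2 = 9
n2 = a @ a                          # |a|^2
App = np.outer(a, a) / n2           # A'' = (1/|a|^2) a a^T, orthogonal projection onto span(a)
A = n2 * (np.eye(3) - App)          # A = |a|^2 (I - A'')

print(A @ a)                        # ~ [0, 0, 0]: a is an eigenvector with eigenvalue 0
x = np.array([2.0, -1.0, 0.0])      # x . a = 0
print(A @ x - n2 * x)               # ~ [0, 0, 0]: eigenvalue |a|^2 on the plane perpendicular to a
print(np.linalg.eigvalsh(A))        # ~ [0, 9, 9]
```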

Another answer:

With matrix multiplication notation you get $$Ax = (a^Ta)x-(aa^T)x = ((a^Ta)I-aa^T)x$$

so $$A = (a^Ta)I-aa^T$$

where you can see that $A$ is symmetric and therefore, by the spectral theorem, has an orthonormal basis of eigenvectors.

Eigenvalues $\lambda$ fulfill

$$Ax = \lambda x$$

or in your case

$$((a^Ta - \lambda)I-aa^T)x = 0 \ .$$

You can find the eigenvalues by computing the roots of the polynomial

$$\text{det}((a^Ta - \lambda)I-aa^T) = 0 \ .$$
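By the matrix determinant lemma, $\det(cI-aa^T)=c^{2}(c-a^Ta)$ in three dimensions, so with $c=a^Ta-\lambda$ the determinant factors as $-\lambda\,(a^Ta-\lambda)^{2}$, giving the simple root $\lambda=0$ and the double root $\lambda=a^Ta$. A quick numerical sanity check of these roots (the vector `a` is an arbitrary example):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])   # example vector; a^T a = 9
n2 = a @ a

def char_det(lam):
    # det((a^T a - lam) I - a a^T)
    return np.linalg.det((n2 - lam) * np.eye(3) - np.outer(a, a))

print(char_det(0.0))            # ~ 0    (lam = 0 is a root)
print(char_det(n2))             # ~ 0    (lam = |a|^2 is a root)
print(char_det(1.0))            # ~ -64, matching -lam (|a|^2 - lam)^2 at lam = 1
```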

Another answer:

Note that for all $x,y\in\mathbb{R}^3,$ $$\langle A x,y\rangle=\langle|a|^2x-\langle a, x\rangle a,y\rangle=|a|^2\langle x,y\rangle-\langle a,x\rangle\langle a,y\rangle=\langle x,Ay\rangle,$$ in other words, $A$ is self-adjoint, hence it has an orthonormal eigenbasis.

In fact, a simple calculation shows that $a$ is an eigenvector (with eigenvalue $0$), and so is every vector perpendicular to $a$ (with eigenvalue $|a|^2$).

Another answer:

This is trivial if $\mathbf{a}=\mathbf{0}$, so we can assume $\mathbf{a}$ is not zero. Let's compute $$ A\mathbf{a}= |\mathbf{a}|^2\mathbf{a}-(\mathbf{a}\cdot\mathbf{a})\mathbf{a}= \mathbf{0} $$ which shows $\mathbf{a}\in\ker A$. On the other hand, if $\mathbf{x}\perp\mathbf{a}$, then $$ A\mathbf{x}=|\mathbf{a}|^2\mathbf{x} $$ so $|\mathbf{a}|^2$ is an eigenvalue of $A$ with geometric multiplicity $2$.

The eigenvectors corresponding to $0$ and $|\mathbf{a}|^2$ are mutually orthogonal, so we obtain an orthogonal basis made of eigenvectors.
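Putting this together, one can construct an explicit orthonormal eigenbasis numerically. A sketch with an arbitrary example `a`: a QR factorization of a full-rank matrix whose first column is `a` yields an orthonormal basis whose first vector is parallel to `a` and whose other two vectors are perpendicular to it.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])                   # example vector; |a|^2 = 9
n2 = a @ a
A = n2 * np.eye(3) - np.outer(a, a)

# First column of Q is ±a/|a|; the remaining columns are orthonormal
# and perpendicular to a (Gram-Schmidt via QR).
M = np.column_stack([a, [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])   # full rank by construction
Q, _ = np.linalg.qr(M)

# In this basis A is diagonal: eigenvalue 0 on span(a), |a|^2 on its orthogonal plane.
print(np.round(Q.T @ A @ Q, 8))                 # ~ diag(0, 9, 9)
```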