If $x$ and $y$ are two linearly independent column $n$-vectors, where $n \geq 2$, find all the eigenvalues of $x x^{T} - y y^{T}$.
Could you please tell me whether what I have written is correct or not?
Using the identity
\begin{align*} \lambda^n\det(\lambda I_{(m)} - AB) = \lambda^m\det(\lambda I_{(n)} - BA) \end{align*}
for $A \in F^{m \times n}$ and $B \in F^{n \times m}$, we can compute the characteristic polynomial of $xx^T - yy^T$ directly by setting $A = (x, y) \in F^{n \times 2}$ and $B = (x^T, -y^T)^T \in F^{2 \times n}$, so that $AB = xx^T - yy^T$:
\begin{align*} \varphi(\lambda) &= \det(\lambda I_{(n)} - (xx^T - yy^T)) = \lambda^{n - 2}\det\left(\lambda I_{(2)} - \begin{pmatrix} x^T \\ -y^T \end{pmatrix}\begin{pmatrix} x & y \end{pmatrix}\right) \\ &= \lambda^{n - 2}\begin{vmatrix} \lambda - x^Tx & -x^Ty \\ y^Tx & \lambda + y^Ty \end{vmatrix} \\ &= \lambda^{n - 2}[(\lambda - x^Tx)(\lambda + y^Ty) + (x^Ty)^2] \\ &= \lambda^{n - 2}(\lambda^2 - (x^Tx - y^Ty)\lambda - (x^Txy^Ty - (x^Ty)^2)) \end{align*}
Since $x$ and $y$ are linearly independent, the Cauchy–Schwarz inequality is strict: $(x^Tx)(y^Ty) > (x^Ty)^2$. Hence the discriminant $\Delta$ of the quadratic equation $\lambda^2 - (x^Tx - y^Ty)\lambda - (x^Txy^Ty - (x^Ty)^2) = 0$ satisfies
\begin{align*} \Delta = (x^Tx - y^Ty)^2 + 4(\|x\|^2\|y\|^2 - (x^Ty)^2) > 0. \end{align*} Hence the two non-zero eigenvalues are two distinct real numbers \begin{align*} \lambda_1 = \frac{y^Ty - x^Tx + \sqrt{\Delta}}{2}, \quad \lambda_2 = \frac{y^Ty - x^Tx - \sqrt{\Delta}}{2}. \end{align*}
This looks correct, except for a sign error in the final formulas (the roots of your quadratic are $\lambda_{1,2} = \frac{x^Tx - y^Ty \pm \sqrt{\Delta}}{2}$), and the fact that you could (and do) have $\lambda=0$. Think about it geometrically: $xx^T$ is $x^Tx$ times the orthogonal projection onto the line spanned by $x$, and $yy^T$ is $y^Ty$ times the orthogonal projection onto the line spanned by $y$. So everything orthogonal to both $x$ and $y$ is mapped to zero (you can also check this algebraically), giving an $(n-2)$-dimensional zero eigenspace. You can then restrict to the plane spanned by $x$ and $y$, reducing to a 2D problem, which yields the two eigenvalues you derived: in the basis $v_1=x$, $v_2=y$ we have $(1,0) \mapsto (x^Tx, -y^Tx)$ and $(0,1) \mapsto (x^Ty, -y^Ty)$, giving the matrix in your second line.
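Incidentally, this is easy to sanity-check numerically. Here is a minimal sketch with NumPy (random vectors, which are linearly independent almost surely), using the quadratic's roots with the sign written as $x^Tx - y^Ty$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# M = x x^T - y y^T is symmetric, so eigvalsh applies (returns ascending order)
M = np.outer(x, x) - np.outer(y, y)
eigs = np.linalg.eigvalsh(M)

# Discriminant of lambda^2 - (x^T x - y^T y) lambda - (x^T x y^T y - (x^T y)^2)
delta = (x @ x - y @ y) ** 2 + 4 * ((x @ x) * (y @ y) - (x @ y) ** 2)
lam1 = (x @ x - y @ y + np.sqrt(delta)) / 2
lam2 = (x @ x - y @ y - np.sqrt(delta)) / 2

# Spectrum: n - 2 zeros plus one negative and one positive root
expected = np.sort(np.concatenate(([lam2, lam1], np.zeros(n - 2))))
assert np.allclose(eigs, expected)
```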
Moreover, a geometric analysis leads to the following: let $u=x+y$ and $v=x-y$, so that $M=xx^T-yy^T=\frac{1}{2}[vu^T+uv^T]$, and set
$$w_1 = v|u| + u|v|, \qquad w_2 = v|u| - u|v|.$$
I claim these are the eigenvectors. (The geometry of the problem is that an eigenvector is any vector which, after reflection through the line of $u$ along the direction of $v$, is rotated by 90 degrees; drawing a picture then reveals that it must be a sum of two vectors of equal length directed along the lines of $u$ and $v$, leading to the formulas above.)
We check
$$ 2M w_1 = [vu^T+uv^T](v|u| + u|v|) = v[u^Tv\,|u| + u^Tu\,|v|] + u[v^Tv\,|u| + v^Tu\,|v|] $$ $$ = v|u|\,[u^Tv + |u||v|] + u|v|\,[|v||u| + v^Tu] = [|v||u| + v^Tu]\,w_1. $$
Similarly
$$ 2M w_2 = [vu^T+uv^T](v|u| - u|v|) = v[u^Tv\,|u| - u^Tu\,|v|] + u[v^Tv\,|u| - v^Tu\,|v|] $$ $$ = v|u|\,[u^Tv - |u||v|] - u|v|\,[u^Tv - |u||v|] = [u^Tv - |u||v|]\,w_2. $$
Thus the eigenvalues are $\lambda_1=\frac{1}{2} [|v||u|+v^Tu] $ and $\lambda_2=\frac{1}{2}[ -|v||u|+v^Tu]$.
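These eigenvector and eigenvalue formulas can also be verified numerically; a minimal sketch with NumPy (random $x$, $y$, almost surely linearly independent):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
x = rng.standard_normal(n)
y = rng.standard_normal(n)

u, v = x + y, x - y
M = np.outer(x, x) - np.outer(y, y)
assert np.allclose(M, (np.outer(v, u) + np.outer(u, v)) / 2)  # M = (v u^T + u v^T)/2

nu, nv = np.linalg.norm(u), np.linalg.norm(v)
w1 = v * nu + u * nv
w2 = v * nu - u * nv

lam1 = (nv * nu + v @ u) / 2
lam2 = (-nv * nu + v @ u) / 2

# w1, w2 are eigenvectors of M with eigenvalues lam1, lam2
assert np.allclose(M @ w1, lam1 * w1)
assert np.allclose(M @ w2, lam2 * w2)
```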
You can check that their sum is $\lambda_1+\lambda_2 = v^Tu = x^Tx-y^Ty$, matching the trace of your matrix, and their product is
$$\lambda_1\lambda_2 = \frac{1}{4}[(v^Tu)^2-|u|^2|v|^2] = \frac{1}{4}[(|x|^2-|y|^2)^2-(|x|^2+|y|^2)^2+(2x^Ty)^2] = (x^Ty)^2-|x|^2|y|^2,$$
matching the determinant of your matrix.