How to find an eigenvalue decomposition of a symmetric matrix $\mathbf{A} = \mathbf{\alpha}\mathbf{\beta}^T + \mathbf{\beta}\mathbf{\alpha}^T$


Let $\boldsymbol{\alpha}, \boldsymbol{\beta} \in \mathbb{R}^{n}$ be two linearly independent vectors, with unit norm ($\Vert \boldsymbol{\alpha} \Vert _{2} =\Vert \boldsymbol{\beta} \Vert _{2} =1$). Define the symmetric matrix $\mathbf{A} =\boldsymbol{\alpha}\boldsymbol{\beta}^{T} +\boldsymbol{\beta}\boldsymbol{\alpha}^{T}$.
I know that $\boldsymbol{\alpha}+\boldsymbol{\beta}$ and $\boldsymbol{\alpha}-\boldsymbol{\beta}$ are two eigenvectors of $\mathbf A$, and that the dimension of the nullspace of $\mathbf A$ is $n - 2$.
But how do I find the nullspace of $\mathbf A$, and how do I find an eigenvalue decomposition of $\mathbf A$ in terms of $\boldsymbol\alpha$ and $\boldsymbol\beta$?

Best answer:

Because you already know that $\boldsymbol v_\pm = \boldsymbol\alpha \pm \boldsymbol\beta$ (that is, $\boldsymbol v_+$ and $\boldsymbol v_-$) are (linearly independent) eigenvectors of $\boldsymbol A$, there isn't much more work to do here. First, verify that the eigenvectors $\boldsymbol v_\pm$ satisfy $\boldsymbol A \boldsymbol v_\pm = \lambda_\pm \boldsymbol v_\pm$, where $\lambda_\pm = \boldsymbol \beta^T\boldsymbol \alpha \pm 1$: using $\Vert \boldsymbol\alpha \Vert_2 = \Vert \boldsymbol\beta \Vert_2 = 1$, $$ \boldsymbol A(\boldsymbol\alpha \pm \boldsymbol\beta) = \boldsymbol\alpha(\boldsymbol\beta^T\boldsymbol\alpha) \pm \boldsymbol\alpha + \boldsymbol\beta \pm \boldsymbol\beta(\boldsymbol\alpha^T\boldsymbol\beta) = (\boldsymbol\beta^T\boldsymbol\alpha \pm 1)(\boldsymbol\alpha \pm \boldsymbol\beta). $$ That is, we have the two eigenvalues $\lambda_+ = 1 + \boldsymbol\beta^T\boldsymbol\alpha$ and $\lambda_- = \boldsymbol\beta^T\boldsymbol\alpha - 1$. (Since $\boldsymbol\alpha$ and $\boldsymbol\beta$ are linearly independent unit vectors, $|\boldsymbol\beta^T\boldsymbol\alpha| < 1$, so $\lambda_+ > 0 > \lambda_-$ and both are non-zero.)
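None of this is needed for the proof, but here is a quick NumPy sanity check of the eigenpair claim. The names `a`, `b` and the random seed are my own choices; random unit vectors stand in for $\boldsymbol\alpha$ and $\boldsymbol\beta$ (they are linearly independent with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Two random unit vectors standing in for alpha and beta.
a = rng.standard_normal(n)
a /= np.linalg.norm(a)
b = rng.standard_normal(n)
b /= np.linalg.norm(b)

A = np.outer(a, b) + np.outer(b, a)  # A = a b^T + b a^T
c = b @ a                            # beta^T alpha

v_plus, v_minus = a + b, a - b
lam_plus, lam_minus = 1 + c, c - 1

# Check A v_pm = lambda_pm v_pm.
assert np.allclose(A @ v_plus, lam_plus * v_plus)
assert np.allclose(A @ v_minus, lam_minus * v_minus)
```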

Now, let $\mathcal N = \{\boldsymbol \alpha, \boldsymbol \beta\}^\perp$ (the orthogonal complement to the span of $\boldsymbol \alpha$ and $\boldsymbol \beta$). Note that $\mathcal N$ is a space of dimension $n-2$. Verify that for any $\boldsymbol v \in \mathcal N$, we have $\boldsymbol A \boldsymbol v = 0$. Thus, $\mathcal N$ is a subspace of the nullspace of $\boldsymbol A$, and because of its dimension you can deduce that $\mathcal N$ is the entire nullspace of $\boldsymbol A$.
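Again as a numerical sketch only (random unit vectors, my own seed): the orthogonal complement of $\operatorname{span}\{\boldsymbol\alpha,\boldsymbol\beta\}$ can be extracted from a full QR factorization, and it is annihilated by $\boldsymbol A$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
a = rng.standard_normal(n)
a /= np.linalg.norm(a)
b = rng.standard_normal(n)
b /= np.linalg.norm(b)
A = np.outer(a, b) + np.outer(b, a)

# Orthonormal basis of span{a, b}^perp: last n-2 columns of a full QR of [a b].
Q, _ = np.linalg.qr(np.column_stack([a, b]), mode="complete")
N = Q[:, 2:]

# Every vector orthogonal to both a and b is annihilated by A ...
assert np.allclose(A @ N, 0.0)
# ... and rank(A) = 2, so the nullspace is exactly (n-2)-dimensional.
assert np.linalg.matrix_rank(A) == 2
```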

Putting all that together, you know that the eigenvalues of $\boldsymbol A$ are $\lambda_+,\lambda_-,$ and $0$ with multiplicity $n-2$.

For the eigenvalue decomposition of $\boldsymbol A$, let $\boldsymbol u_\pm$ denote the unit vectors in the direction of $\boldsymbol v_\pm$. You can either verify directly or deduce as a consequence of the symmetry of $\boldsymbol A$ that the vectors $\boldsymbol v_\pm$ and hence the vectors $\boldsymbol u_\pm$ are mutually orthogonal. From there, we have $$ \boldsymbol A = \lambda_- \boldsymbol u_- \boldsymbol u_-^T + \lambda_+ \boldsymbol u_+ \boldsymbol u_+^T. $$

As it turns out, this can be rewritten in the form $$ \boldsymbol A = \frac 12 \left[(\boldsymbol \alpha + \boldsymbol \beta)(\boldsymbol \alpha + \boldsymbol \beta)^T - (\boldsymbol \alpha - \boldsymbol \beta)(\boldsymbol \alpha - \boldsymbol \beta)^T\right], $$ where the first outer product corresponds to the "$+$" term of the decomposition above and the second to the "$-$" term. Indeed, expanding the right-hand side, the $\boldsymbol\alpha\boldsymbol\alpha^T$ and $\boldsymbol\beta\boldsymbol\beta^T$ terms cancel and the cross terms give $\boldsymbol\alpha\boldsymbol\beta^T + \boldsymbol\beta\boldsymbol\alpha^T = \boldsymbol A$.
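The rank-two identity can also be checked numerically (again a sketch with random unit vectors, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
a = rng.standard_normal(n)
a /= np.linalg.norm(a)
b = rng.standard_normal(n)
b /= np.linalg.norm(b)
A = np.outer(a, b) + np.outer(b, a)

vp, vm = a + b, a - b
# A = (1/2) [ (a+b)(a+b)^T - (a-b)(a-b)^T ]
A_rank2 = 0.5 * (np.outer(vp, vp) - np.outer(vm, vm))
assert np.allclose(A, A_rank2)
```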


Suppose that we didn't come in with knowledge of the forms of the eigenvectors $\boldsymbol v_\pm$. We could do the following. Note that $\boldsymbol A$ can be written as $\boldsymbol A = \boldsymbol B \boldsymbol C$, where $$ \boldsymbol B = \pmatrix{\boldsymbol \alpha & \boldsymbol \beta}, \quad \boldsymbol C = \pmatrix{\boldsymbol \beta & \boldsymbol \alpha}^T. $$ The matrices $\boldsymbol B \boldsymbol C$ and $\boldsymbol C \boldsymbol B$ must have the same non-zero eigenvalues. Thus, the non-zero eigenvalues of $\boldsymbol A$ must be equal to the non-zero eigenvalues of $$ \boldsymbol C \boldsymbol B = \pmatrix{\boldsymbol \beta^T \boldsymbol\alpha & \boldsymbol\beta^T \boldsymbol \beta\\ \boldsymbol\alpha^T \boldsymbol\alpha & \boldsymbol \alpha^T \boldsymbol\beta} = \pmatrix{ \boldsymbol\alpha^T \boldsymbol \beta & 1 \\ 1 & \boldsymbol \alpha^T \boldsymbol\beta}. $$ It is easy to show that the eigenvalues of this matrix are $\boldsymbol \alpha ^T\boldsymbol \beta \pm 1$, matching the values of $\lambda_\pm$ found above.
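The $\boldsymbol B\boldsymbol C$ versus $\boldsymbol C\boldsymbol B$ trick can likewise be sanity-checked numerically (random unit vectors, my own seed; not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
a = rng.standard_normal(n)
a /= np.linalg.norm(a)
b = rng.standard_normal(n)
b /= np.linalg.norm(b)

B = np.column_stack([a, b])    # n x 2
C = np.column_stack([b, a]).T  # 2 x n
A = B @ C                      # = a b^T + b a^T
c = a @ b

# CB is the 2x2 matrix [[c, 1], [1, c]], with eigenvalues c - 1 and c + 1.
CB = C @ B
assert np.allclose(CB, np.array([[c, 1.0], [1.0, c]]))
assert np.allclose(np.sort(np.linalg.eigvalsh(CB)), [c - 1, c + 1])

# They match the two non-zero eigenvalues of A; the remaining n-2 are ~0.
eig_A = np.sort(np.linalg.eigvalsh(A))
assert np.allclose(eig_A[0], c - 1) and np.allclose(eig_A[-1], c + 1)
assert np.allclose(eig_A[1:-1], 0.0)
```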