Eigenvalues of matrices when adding an extra term


I have a matrix $A$ with eigenvalues $\lambda_i$. I then add a rank-one term to it: $A + xx^T$, where $x$ is a column vector.

Also, $A = yy^T$ before adding the above term, where $y$ is a column vector too, so $A$ is symmetric. Note that $\|xx^T\|$ can be bigger or smaller than $\|yy^T\|$.

Is there any way to calculate the new eigenvalues of $xx^T+yy^T$ using only the information from the vector $x$ and the eigenvalues of $yy^T$, in order to do it faster?

5 Answers

Best answer (score 4)

Given the vectors $x,y\in {\mathbb R}^n$ and the matrix $$A=xx^T+yy^T$$ its eigenvectors and eigenvalues can be calculated from first principles.


Let's look for a vector of the form $z=(y+\beta x)$ which satisfies the eigenvalue equation $$\eqalign{ Az &= \lambda z \cr (xx^T+yy^T)\,(y+\beta x) &= \lambda y+\lambda\beta x \cr xx^Ty+\beta xx^Tx+yy^Ty+\beta yy^Tx &= \lambda y+\lambda\beta x \cr\cr }$$ Collecting coefficients on $x$ and on $y$ yields two equations for $\lambda$ $$\eqalign{ x^Tx + \frac{1}{\beta}x^Ty &= \lambda \cr y^Ty+\beta\,y^Tx &= \lambda \cr }$$ In the event that $x$ and $y$ are orthogonal, you can stop here.

The eigenvectors and eigenvalues are $$\eqalign{ z_{1,2} &= \{x, \,\,y\} \cr \lambda_{1,2} &= \{x^Tx, \,\,y^Ty\} \cr\cr } $$
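The orthogonal case is easy to verify numerically; here is a minimal NumPy sketch, with an arbitrary orthogonal pair of vectors chosen for the check:

```python
import numpy as np

# If x is orthogonal to y, then x and y are themselves eigenvectors of
# A = x x^T + y y^T, with eigenvalues x^T x and y^T y.
# The vectors below are an arbitrary pair satisfying x^T y = 0.
x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 3.0])
assert abs(x @ y) < 1e-12               # x and y are orthogonal

A = np.outer(x, x) + np.outer(y, y)
print(np.allclose(A @ x, (x @ x) * x))  # True: A x = ||x||^2 x
print(np.allclose(A @ y, (y @ y) * y))  # True: A y = ||y||^2 y
```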

Otherwise, equating the two expressions for $\lambda$ leads to a quadratic equation in $\beta$ $$\eqalign{ y^Ty+\beta\,x^Ty &= x^Tx + \frac{1}{\beta}x^Ty \cr \beta\,y^Ty+\beta^2\,x^Ty &= \beta\,x^Tx + x^Ty \cr \beta^2\,(x^Ty) + \beta\,(y^Ty-x^Tx) - (x^Ty) &= 0 \cr\cr }$$ whose solution is $$\eqalign{ \beta_\pm &= \frac{(x^Tx-y^Ty) \pm\sqrt{(x^Tx-y^Ty)^2 + 4(x^Ty)^2}}{2(x^Ty)} \cr &= r \pm\sqrt{r^2+1} \cr }$$ where $$r=\frac{x^Tx-y^Ty}{2x^Ty}$$

Knowing the values $\beta_\pm$ you know the corresponding eigenvalues $$\lambda_\pm = y^Ty+\beta_\pm y^Tx$$ and eigenvectors $$z_\pm=y+\beta_\pm x$$


Update

Here is some Julia code to validate these formulas (outputs shown as comments are from one sample run; the vectors are random, so each run differs).

#!/usr/bin/env julia
using LinearAlgebra

# generate vectors and matrix
n = 4
x = randn(n); y = randn(n)
A = x*x' + y*y'
r = (x'*x - y'*y) / (2*x'*y)

# eigenvalue from beta plus
b = r + sqrt(r^2 + 1)
println(y'*y + b*x'*y)    # e.g. 3.93903

# eigenvalue from beta minus
b = r - sqrt(r^2 + 1)
println(y'*y + b*x'*y)    # e.g. 7.9371

# full spectrum: two (numerically) zero eigenvalues plus the two values above
println(eigvals(A))       # e.g. [-2.22045e-15, 5.01652e-16, 3.93903, 7.9371]
Answer (score 6)

No. For instance take $$ y=\begin{bmatrix}1\\0\end{bmatrix},\ \ \ \ \ x=\begin{bmatrix}1\\0\end{bmatrix}.$$ Then $$ A=yy^T=\begin{bmatrix}1&0\\0&0\end{bmatrix},\ \ \ \ \ \ A+xx^T=\begin{bmatrix}2&0\\0&0\end{bmatrix}. $$ If we take the same $x$ but $y=\begin{bmatrix}0\\1\end{bmatrix}$, now $$ A=yy^T=\begin{bmatrix}0&0\\0&1\end{bmatrix},\ \ \ \ \ \ A+xx^T=\begin{bmatrix}1&0\\0&1\end{bmatrix}. $$ In the first case the eigenvalues of $A+xx^T$ are $2,0$, while in the second they are $1,1$ (with the same $x$ and the same eigenvalues of $yy^T$).
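This counterexample can be confirmed numerically; a quick NumPy check (the names `y1`, `y2` for the two choices of $y$ are mine):

```python
import numpy as np

# Same x, same spectrum of y y^T, but different spectra for A + x x^T.
x = np.array([1.0, 0.0])
y1 = np.array([1.0, 0.0])
y2 = np.array([0.0, 1.0])

# Both y1 y1^T and y2 y2^T have eigenvalues {0, 1} ...
print(np.linalg.eigvalsh(np.outer(y1, y1)))  # [0. 1.]
print(np.linalg.eigvalsh(np.outer(y2, y2)))  # [0. 1.]

# ... yet the sums do not share a spectrum.
print(np.linalg.eigvalsh(np.outer(y1, y1) + np.outer(x, x)))  # [0. 2.]
print(np.linalg.eigvalsh(np.outer(y2, y2) + np.outer(x, x)))  # [1. 1.]
```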

Answer (score 1)

Since $A+xx^{T}$ has rank at most two, you can find its two non-zero eigenvalues relatively quickly by various iterative methods. The matrix-vector products in such methods can be computed very quickly because of the special rank-two structure.
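One way to sketch this idea in NumPy is power iteration with deflation, where each product $Av$ costs only $O(n)$ because $Av = x(x^Tv)+y(y^Tv)$, so $A$ is never formed. The helper names (`top_eigpair`, `mv1`, `mv2`) and the sample vectors are illustrative assumptions, not part of the answer:

```python
import numpy as np

# Power iteration with deflation on A = x x^T + y y^T, never forming A.
def top_eigpair(mv, v0, iters=1000):
    """Dominant eigenpair of the linear map v -> mv(v)."""
    v = v0 / np.linalg.norm(v0)
    for _ in range(iters):
        w = mv(v)
        v = w / np.linalg.norm(w)
    return v @ mv(v), v          # Rayleigh quotient, eigenvector

x = np.array([3.0, 1.0, 0.0])
y = np.array([1.0, 2.0, 1.0])

mv1 = lambda v: x * (x @ v) + y * (y @ v)          # v -> A v in O(n)
lam1, v1 = top_eigpair(mv1, np.array([1.0, 1.0, 1.0]))

# Deflate: the second eigenvalue of A is the top one of A - lam1 v1 v1^T.
mv2 = lambda v: mv1(v) - lam1 * v1 * (v1 @ v)
lam2, _ = top_eigpair(mv2, np.array([1.0, 0.0, 0.0]))

A = np.outer(x, x) + np.outer(y, y)
print(np.allclose(sorted([lam1, lam2]), np.linalg.eigvalsh(A)[-2:]))  # True
```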

Answer (score 0)

In general, you cannot; it is problem-specific. For instance, if $\pmb{x}$ and $\pmb{y}$ are linearly dependent ($\pmb{y} = \alpha\pmb{x}$ for a nonzero $\alpha$), then the matrix $\pmb{B} = \pmb{x}\pmb{x}^T + \pmb{y}\pmb{y}^T$ can be written as \begin{equation} \begin{split} \pmb{B} & = \pmb{x}\pmb{x}^T + \pmb{y}\pmb{y}^T \\&= \pmb{x}\pmb{x}^T + \alpha^2\pmb{x}\pmb{x}^T \\&= (1+\alpha^2)\pmb{x}\pmb{x}^T \end{split} \end{equation} which has rank one: its only nonzero eigenvalue is $(1+\alpha^2)\|\pmb{x}\|^2$, and every other eigenvalue is $0$.
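A quick numerical check of the rank-one claim (the vector and scalar below are arbitrary choices of mine):

```python
import numpy as np

# If y = a*x, then B = x x^T + y y^T = (1 + a^2) x x^T is rank one:
# its only nonzero eigenvalue is (1 + a^2) ||x||^2.
x = np.array([1.0, 2.0, 2.0])
a = 3.0
y = a * x

B = np.outer(x, x) + np.outer(y, y)
vals = np.linalg.eigvalsh(B)                      # ascending order
print(np.allclose(vals[:2], 0.0))                 # True: two zero eigenvalues
print(np.isclose(vals[-1], (1 + a**2) * (x @ x)))  # True: equals 90.0 here
```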

P.S.: @Martin Argerami's first counterexample is a particular case of this ($\alpha = 1$).

Answer (score 0)

First point. Let $A,B\in M_n(\mathbb{R})$ be symmetric matrices with known spectra. There are no equalities linking $\operatorname{spectrum}(A)$, $\operatorname{spectrum}(B)$, and $\operatorname{spectrum}(A+B)$; there exist only inequalities (cf. Horn's conjecture).

Second point. Solving your problem is easy. As Brian wrote, $A=xx^T+yy^T$ has at most two non-zero eigenvalues $\lambda,\mu$ with associated orthogonal eigenvectors $u,v$. Moreover, if $\operatorname{rank}(A)=2$, then $A$ is diagonalizable over $\mathbb{R}$ (there is $P\in O(n)$ s.t. $A=P\operatorname{diag}(0_{n-2},\lambda,\mu)P^{-1}$, where $\lambda,\mu>0$). Let $z$ be a random vector: $z=z_1+au+bv$, where $z_1\in\ker(A)$, $a\not= 0$, $b\not= 0$; thus $Az=a\lambda u+b\mu v$ and $A^2z=a\lambda^2u+b\mu^2v$.

Case 1. If $Az$ and $A^2z$ are linearly dependent, then $\lambda=\mu=\tfrac12\operatorname{trace}(A)=\tfrac12(\|x\|^2+\|y\|^2)$.

Case 2. Otherwise, let $\Pi$ be the $A$-invariant plane spanned by $Az,A^2z$; to obtain $\lambda,\mu$ it suffices to write the matrix of $A_{|\Pi}$ in the basis $Az,A^2z$, that is $\begin{pmatrix}0&r\\1&tr(A)\end{pmatrix}$, where $A^3z=rAz+tr(A)A^2z$ (thus $r=\dfrac{[A^3z-tr(A)A^2z]_1}{[Az]_1}$). Finally $\lambda,\mu=\dfrac{tr(A)\pm \sqrt{tr(A)^2+4r}}{2}$.

The complexity of the calculation of $(\lambda,\mu)$ is the complexity of the calculations of $Az,A^2z,A^3z$, that is $O(n^2)$.
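The recipe above can be sketched in NumPy; the helper name `two_eigenvalues` is mine, and choosing the pivot component with `argmax` (to ensure $[Az]_i\neq 0$) is an added assumption for robustness. Only Case 2 is handled:

```python
import numpy as np

# Recover the two nonzero eigenvalues of A = x x^T + y y^T from
# A z, A^2 z, A^3 z, without ever forming A.
def two_eigenvalues(x, y, z):
    mv = lambda v: x * (x @ v) + y * (y @ v)  # v -> A v
    Az = mv(z); A2z = mv(Az); A3z = mv(A2z)
    tr = x @ x + y @ y                        # trace(A) = ||x||^2 + ||y||^2
    # Case 2: solve A^3 z = r * A z + trace(A) * A^2 z for r
    i = np.argmax(np.abs(Az))                 # a component with [Az]_i != 0
    r = (A3z[i] - tr * A2z[i]) / Az[i]
    disc = np.sqrt(tr**2 + 4.0 * r)
    return (tr - disc) / 2.0, (tr + disc) / 2.0

x = np.array([3.0, 1.0, 0.0])
y = np.array([1.0, 2.0, 1.0])
lam_minus, lam_plus = two_eigenvalues(x, y, np.array([1.0, 1.0, 1.0]))

A = np.outer(x, x) + np.outer(y, y)
print(np.allclose([lam_minus, lam_plus], np.linalg.eigvalsh(A)[-2:]))  # True
```

Note that the two eigenvalues sum to $\operatorname{trace}(A)=\|x\|^2+\|y\|^2$, as they must.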