If $A + \Delta A$ has eigenvalue $\lambda$, show that $A$ has eigenvalue $\lambda + \Delta \lambda$.


$A$ is symmetric and can be diagonalised as $A = V\Lambda V^{-1}$, where $\Lambda$ is a diagonal matrix whose entries are the eigenvalues of $A$ and $V$ is an orthogonal matrix whose columns are the corresponding eigenvectors of $A$ (so $V^{-1} = V^T$).

If $A + \Delta A$ has eigenvalue $\lambda$, show that $A$ has eigenvalue $\lambda + \Delta \lambda$ with

$$\Delta \lambda = 0 \qquad\text{or}\qquad |\Delta \lambda| = ||(\lambda I - \Lambda)^{-1}||_2^{-1}.$$

My attempt:

$$A u = (\lambda + \Delta \lambda)u$$
$$A u = \lambda u + \Delta \lambda u$$
$$- \Delta \lambda u = (\lambda I - A)u$$
$$-(\lambda I - A)^{-1} \Delta \lambda u = u$$
$$\|u\|_2 = \|(\lambda I - A)^{-1} \Delta \lambda u\|_2$$
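As a numerical sanity check of the last identity (the matrix, $\lambda$, and the eigenpair below are hypothetical choices, not from the problem): if $u$ is an eigenvector of $A$ with eigenvalue $\mu = \lambda + \Delta\lambda$, then $(\lambda I - A)^{-1}\Delta\lambda\, u = -u$, and submultiplicativity gives $|\Delta\lambda| \ge \|(\lambda I - A)^{-1}\|_2^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric A and an eigenpair (mu, u) of A.
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2
w, V = np.linalg.eigh(A)
mu, u = w[0], V[:, 0]          # A u = mu u, with ||u||_2 = 1

lam = 0.5                      # a lambda that (a.s.) is not an eigenvalue of A
dlam = mu - lam                # so A u = (lam + dlam) u

# ||u||_2 = ||(lam I - A)^{-1} dlam u||_2, and by submultiplicativity
# |dlam| >= ||(lam I - A)^{-1}||_2^{-1}.
R = np.linalg.inv(lam * np.eye(4) - A)
lhs = np.linalg.norm(u)
rhs = np.linalg.norm(R @ (dlam * u))
assert np.isclose(lhs, rhs)
assert abs(dlam) >= 1.0 / np.linalg.norm(R, 2) - 1e-12
```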

Where do I go from here?


The key to this problem, I think, is to realize that the hypothesis about $A + \Delta A$ doesn't actually tell us anything about $A$ itself: the statement to be proved is really a statement about $A$ and $\lambda$ alone.


By replacing $A$ with $(A - \lambda I)$, suppose without loss of generality that $\lambda = 0$.

We are given that $A + \Delta A$ has eigenvalue $0$, and we want to show that either $A$ has eigenvalue $0$, or $A$ has an eigenvalue $\Delta \lambda$ with $|\Delta \lambda| = \|\Lambda^{-1}\|_2^{-1}$. Equivalently: if $A$ does not have $0$ as an eigenvalue (i.e. $A$ is invertible), then $A$ has an eigenvalue $\Delta \lambda$ with $|\Delta \lambda| = \|\Lambda^{-1}\|_2^{-1}$.

This is easy to see: suppose that $A$ is invertible. We write $$ \Lambda = \pmatrix{\lambda_1 \\ & \ddots \\ && \lambda_n}, $$ where each $\lambda_i$ is a (necessarily nonzero) eigenvalue of $A$. We note that $$ \Lambda^{-1} = \pmatrix{\lambda_1^{-1}\\ & \ddots \\ && \lambda_n^{-1}}, $$ from which it follows that $$ \|\Lambda^{-1}\|_2 = \max_{j=1,\dots,n} |\lambda_j|^{-1} = \left(\min_{j=1,\dots,n} |\lambda_j|\right)^{-1}. $$ Thus, we have $\|\Lambda^{-1}\|_2^{-1} = \min_{j=1,\dots,n}|\lambda_j|$. So, it is indeed the case that $A$ has an eigenvalue $\Delta \lambda = \lambda_j$ for which $|\lambda_j| = \|\Lambda^{-1}\|_2^{-1}$, which is what we wanted to show.
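The identity $\|\Lambda^{-1}\|_2^{-1} = \min_j |\lambda_j|$ at the heart of this argument is easy to check numerically; here is a minimal sketch with a randomly generated symmetric matrix (invertible with probability $1$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric matrix A.
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

eigvals = np.linalg.eigvalsh(A)          # eigenvalues lambda_1..lambda_n of A
Lambda_inv = np.diag(1.0 / eigvals)      # Lambda^{-1}

# ||Lambda^{-1}||_2 is the spectral norm; for a diagonal matrix this is
# max_j |lambda_j|^{-1}, so its reciprocal is min_j |lambda_j|.
lhs = 1.0 / np.linalg.norm(Lambda_inv, 2)
rhs = np.min(np.abs(eigvals))

assert np.isclose(lhs, rhs)
```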


A less "matrix-dependent" approach: again, suppose WLOG that $\lambda = 0$, and suppose that $A$ is invertible. We note that for $z \in \Bbb C$, $$ A - zI \text{ is invertible }\iff I - z A^{-1} \text{ is invertible}. $$ For $z$ with $|z|$ sufficiently small, $I - zA^{-1}$ must be invertible because its inverse is given by the convergent Neumann series $$ (I - zA^{-1})^{-1} = \sum_{k=0}^\infty z^k A^{-k}. $$ The radius of convergence of this series is $\rho(A^{-1})^{-1}$, which is equal to $\|\Lambda^{-1}\|_2^{-1}$ (and, since $A$ is real and symmetric, also to $\|A^{-1}\|_2^{-1}$). Because a power series must have a singularity somewhere on its circle of convergence, there exists a $z \in \Bbb C$ with $|z| = \|\Lambda^{-1}\|_2^{-1}$ at which the function $(I - zA^{-1})^{-1} = A(A - zI)^{-1}$ has a singularity, which is to say that $A - zI$ fails to be invertible. In other words, such a $z$ must be an eigenvalue of $A$.
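The two facts this argument relies on — that the radius of convergence $\rho(A^{-1})^{-1}$ equals $\min_j |\lambda_j|$, and that the Neumann series really does converge to $(I - zA^{-1})^{-1}$ inside that radius — can be sketched numerically (the matrix below is a hypothetical example):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                         # symmetric, a.s. invertible
Ainv = np.linalg.inv(A)

# Radius of convergence of sum_k z^k A^{-k} is rho(A^{-1})^{-1};
# for symmetric A this equals min_j |lambda_j|.
radius = 1.0 / np.max(np.abs(np.linalg.eigvalsh(Ainv)))
assert np.isclose(radius, np.min(np.abs(np.linalg.eigvalsh(A))))

# Inside the radius, partial sums of the Neumann series converge
# to (I - z A^{-1})^{-1}.
z = 0.5 * radius
S = sum(z**k * np.linalg.matrix_power(Ainv, k) for k in range(200))
assert np.allclose(S, np.linalg.inv(np.eye(4) - z * Ainv))
```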