Eigenvector of the sum of two skew-symmetric matrices


I am trying to understand how a perturbation of a skew-symmetric matrix by another skew-symmetric matrix affects the dominant eigenvector corresponding to $\lambda=0$.

Specifically, let $S$ be an $n \times n$ real skew-symmetric matrix, with $n$ odd. Assume $S$ has eigenvectors $(e_1,...,e_n)$ and corresponding eigenvalues $(\lambda_1,...,\lambda_n)$, sorted such that $\lambda_1=0$ (a zero eigenvalue is guaranteed because $n$ is odd). Thus, $e_1$ is the fixed-point solution of $S$.

Let $M$ be another skew-symmetric matrix of size $n\times n$, with eigenvectors $(b_1,...,b_n)$ and corresponding eigenvalues $(\theta_1,...,\theta_n)$.

For some small $0<\epsilon\ll1$, define:

$B=S+\epsilon M$

Then $B$ is likewise skew-symmetric, and can be thought of as a perturbation of $S$ by the matrix $M$. Can we approximate or calculate the leading eigenvector of $B$, corresponding to $\lambda=0$, using the eigenvalue distributions of $B$ and $M$? In other words, can we approximate how much the fixed point of $S$ changes when perturbed by $M$?
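To make the question concrete, here is a minimal numerical sketch in NumPy. The specific test matrix (two rotation blocks plus a zero row/column, conjugated by a random orthogonal $Q$), the seed, and all variable names are illustrative assumptions, not part of the question; it just shows that the null direction of $S+\epsilon M$ shifts by an amount on the order of $\epsilon$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # odd, so a real skew-symmetric matrix must have a zero eigenvalue

# A skew-symmetric S with a known kernel: two rotation blocks plus a
# zero row/column, conjugated by a random orthogonal matrix Q.
S0 = np.zeros((n, n))
S0[0, 1], S0[1, 0] = 1.0, -1.0
S0[2, 3], S0[3, 2] = 2.0, -2.0
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
S = Q @ S0 @ Q.T          # still skew-symmetric
u = Q[:, 4]               # unit vector spanning ker(S)

A = rng.standard_normal((n, n))
M = A - A.T               # a random skew-symmetric perturbation
eps = 1e-3
B = S + eps * M

# Kernel vector of B from the SVD (right-singular vector belonging to
# the zero singular value), sign-aligned with u for comparison.
u_pert = np.linalg.svd(B)[2][-1]
u_pert = u_pert * np.sign(u @ u_pert)
print(np.linalg.norm(u_pert - u))  # an O(eps) shift of the fixed point
```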


There are 2 solutions below.

Answer 1:

The physicist's approach to such problems is first-order perturbation theory (below). I am not terribly confident in the applicability of this method here, because (1) the matrices are skew-symmetric and therefore have purely imaginary eigenvalues occurring in conjugate pairs, and (2) $S$ is not invertible by assumption. However, this answer may still be helpful.

The typical argument goes like this. For parallel notation, I'll use $\Delta S$ instead of $M$, and I'll call the eigenvector in question $v$ instead of $e_1$. Then we know $S v = 0$ and we want to find the solution to the new eigenvalue equation

$$ (S + \Delta S)(v + \Delta v) = \Delta \lambda (v + \Delta v) $$

All terms that are first-order perturbations have a $\Delta$. Now if we expand, use $Sv=0$, and retain only terms with a single $\Delta$, we get

$$ S \Delta v + \Delta S v = \Delta \lambda v $$

The next step is to multiply through on the left by $v^T$: since $S$ is skew-symmetric, $v^T S = -(Sv)^T = 0$. Then we get

$$ v^T \Delta S v = \Delta \lambda v^T v $$ or $$ \Delta \lambda = \frac{v^T \Delta S v}{v^T v} = 0 $$

where the last equality follows from $\Delta S$ being skew-symmetric.
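That last step can be checked numerically. The scalar $v^T \Delta S\, v$ equals its own transpose, and transposing flips the sign for skew-symmetric $\Delta S$, so it must vanish. A quick sketch (the random matrix and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
dS = A - A.T                     # skew-symmetric perturbation
v = rng.standard_normal(n)

# v^T dS v = (v^T dS v)^T = -v^T dS v, hence it must be zero
q = v @ dS @ v
print(q)  # ~0 up to floating-point rounding
```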

At this point, the original equation has become

$$ (S + \Delta S)(v + \Delta v) = 0 $$

If we also assume that $\lambda=0$ has multiplicity 1 in $S+\Delta S$, then $v + \Delta v$ must span its one-dimensional kernel. Otherwise, following a suggestion from the comments and from another answer, we could expand, apply $Sv=0$ to get $(S + \Delta S)\,\Delta v = -\Delta S\, v$, and then use the Moore–Penrose pseudo-inverse of $(S+\Delta S)$ in place of a true inverse. This would give

$$ \Delta v = - (S + \Delta S)^+ \Delta S v $$

However, I do not personally know of any theorem that would guarantee that this gives a reasonable result. I suspect that because the eigenvalue is zero, there may be problems with applying perturbation theory here in general.
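Empirically, the pseudo-inverse correction does seem to behave: rearranging $(S+\Delta S)(v+\Delta v)=0$ with $Sv=0$ gives $(S+\Delta S)\,\Delta v = -\Delta S\, v$, and solving it with `numpy.linalg.pinv` lands $v + \Delta v$ in the kernel of the perturbed matrix. A sketch under assumed test data (the specific matrices and seed are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# Skew-symmetric S with a known kernel vector v (hypothetical test matrix)
S0 = np.zeros((n, n))
S0[0, 1], S0[1, 0] = 1.0, -1.0
S0[2, 3], S0[3, 2] = 2.0, -2.0
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
S = Q @ S0 @ Q.T
v = Q[:, 4]                      # S v = 0

A = rng.standard_normal((n, n))
dS = 1e-3 * (A - A.T)            # small skew-symmetric perturbation

# From (S + dS)(v + dv) = 0 and S v = 0:  (S + dS) dv = -dS v,
# solved with the Moore-Penrose pseudo-inverse:
dv = -np.linalg.pinv(S + dS) @ dS @ v
w = v + dv
print(np.linalg.norm((S + dS) @ w))  # ~0: w lies in ker(S + dS)
```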

Answer 2:

Suppose $\ker(S)$ is spanned by a single vector $\bf u$ (in particular $n$ must be odd). Let $S^+$ be the Moore-Penrose pseudo-inverse of $S$.

Now it will be more convenient to write the perturbed matrix as $S + \epsilon M$ where $\epsilon$ is a parameter. Then it turns out that the following series (convergent for sufficiently small $\epsilon$) gives us a vector in $\ker(S+\epsilon M)$:

$$ \sum_{k=0}^\infty \epsilon^k (-S^+ M)^k {\bf u} $$
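This series can be verified numerically by accumulating partial sums of $\epsilon^k (-S^+ M)^k \mathbf{u}$ and checking that the result lies in $\ker(S+\epsilon M)$. A minimal sketch; the particular $S$, $M$, seed, and iteration count are illustrative assumptions chosen so the series converges quickly:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5

# Skew-symmetric S with a known one-dimensional kernel (hypothetical test matrix)
S0 = np.zeros((n, n))
S0[0, 1], S0[1, 0] = 1.0, -1.0
S0[2, 3], S0[3, 2] = 2.0, -2.0
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
S = Q @ S0 @ Q.T
u = Q[:, 4]                      # spans ker(S)

A = rng.standard_normal((n, n))
M = A - A.T                      # skew-symmetric perturbation
eps = 1e-2
Sp = np.linalg.pinv(S)           # Moore-Penrose pseudo-inverse S^+

# Partial sums of  sum_k eps^k (-S^+ M)^k u  (convergent here since
# eps * ||S^+ M|| is well below 1 for this test matrix)
w = np.zeros(n)
term = u.copy()
for _ in range(50):
    w = w + term
    term = eps * (-(Sp @ (M @ term)))
print(np.linalg.norm((S + eps * M) @ w))  # ~0: w lies in ker(S + eps*M)
```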