Positive subspace for symmetric matrix


Let $A$ be a symmetric, real matrix of signature $(r,s)$; so there are $r$ positive and $s$ negative eigenvalues. Let $u_1,...,u_r$ be eigenvectors for the positive eigenvalues.

I want to compute the "maximal positive definite" subspace $\mathrm{span}(u_1,...,u_r)$; in particular, given a vector $v \in \mathbb{R}^{r+s}$ I want to decide whether it lies in this subspace. Can this be done without computing the eigenvectors of $A$?
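For reference, the direct check the question hopes to avoid is straightforward with an eigendecomposition; a minimal NumPy sketch (the helper name and the tolerance `tol` are my own choices):

```python
import numpy as np

def in_positive_subspace(A, v, tol=1e-10):
    """Check whether v lies in the span of A's positive eigenvectors
    by projecting v onto that span and measuring the residual."""
    w, U = np.linalg.eigh(A)                 # eigenvalues ascending, orthonormal U
    U_pos = U[:, w > 0]                      # eigenvectors of the positive eigenvalues
    residual = v - U_pos @ (U_pos.T @ v)     # component of v outside the subspace
    return np.linalg.norm(residual) <= tol * np.linalg.norm(v)

# Example: A with signature (1, 1); the positive subspace is spanned by e1.
A = np.array([[2.0, 0.0],
              [0.0, -3.0]])
print(in_positive_subspace(A, np.array([1.0, 0.0])))  # True
print(in_positive_subspace(A, np.array([1.0, 1.0])))  # False
```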

Best answer

Here I propose an iterative method that works for any diagonalizable matrix $A$. First, we need a matrix function $f(A)$ that maps every eigenvalue $\lambda$ of $A$ with $\mathrm{Re}\,\lambda>0$ to a point inside the unit circle, i.e., $|f(\lambda)|<1$, and every eigenvalue with $\mathrm{Re}\,\lambda<0$ to a point outside it, i.e., $|f(\lambda)|>1$. Then one can apply $[f(A)]^n$ to $v$ and check whether the norm of

$$v_n=[f(A)]^{n\,} v$$

goes to zero. If it does, every eigenvector component of $v$ must belong to an eigenvalue with $\mathrm{Re}\,\lambda>0$. The matrix exponential $f(A)=\exp(-A)$ would do the job, but computing it via the Taylor expansion of $\exp(\cdot)$ or via the eigendecomposition of $A$ is expensive. A nicer class of functions to consider is the linear fractional (Möbius) transformations. Here we take

$$f(A)=\frac{A-z\,I}{A+z^*I},$$

with $z$ a complex constant satisfying $\mathrm{Re}\,z>0$ and such that $-z^*$ is not an eigenvalue of $A$ (so that the denominator is invertible), and $I$ the identity matrix. One can see that

$$|f(\lambda)|=\frac{|\lambda-z|}{|\lambda+z^*|},$$

so if $\lambda$ is closer to $z$ than to $-z^*$ in the complex plane, we have $|f(\lambda)|<1$; otherwise $|f(\lambda)|>1$. Writing $z=a+bi$ with $a>0$, the points $z$ and $-z^*=-a+bi$ are mirror images across the imaginary axis, so their perpendicular bisector is exactly the imaginary axis. Hence $|f(\lambda)|<1$ is equivalent to $\mathrm{Re}\,\lambda>0$, which is exactly what we wanted.
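A minimal sketch of the iteration in NumPy, assuming $A$ is symmetric real (so the eigenvalues are real and we may take $z$ real; here $z=1$). Each step applies $f(A)$ implicitly by solving a linear system instead of forming the inverse; the test matrix, shift, and iteration count are arbitrary choices:

```python
import numpy as np

def sift_norm(A, v, z=1.0, n_iter=15):
    """Return ||f(A)^n v|| for f(A) = (A - z I)(A + z I)^{-1} with z real.

    The norm tends to 0 iff v lies in the positive eigenspace of A.
    Keep n_iter moderate: the growing modes also amplify roundoff."""
    I = np.eye(A.shape[0])
    for _ in range(n_iter):
        v = np.linalg.solve(A + z * I, (A - z * I) @ v)  # one implicit step
    return np.linalg.norm(v)

# Symmetric A of signature (2, 2) with known eigenvectors Q[:, i].
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal basis
A = Q @ np.diag([3.0, 1.0, -2.0, -0.5]) @ Q.T

v_pos = Q[:, 0] + 0.5 * Q[:, 1]   # lies in the positive subspace
v_mix = Q[:, 0] + Q[:, 2]         # has a negative-eigenvalue component

print(sift_norm(A, v_pos))   # small: shrinks like 0.5^n
print(sift_norm(A, v_mix))   # large: grows like 3^n
```

With $z=1$ here, $|f(3)|=1/2$, $|f(1)|=0$, and $|f(-2)|=|f(-0.5)|=3$, so the first norm decays geometrically while the second blows up.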

This implicit iterative method (each application of $f(A)$ amounts to solving one linear system) generalizes to any eigenvalue-sifting problem in which the two groups of eigenvalues are separated by a line in the complex plane (not necessarily the line $\mathrm{Re}\,\lambda=0$), which can be useful for large sparse matrices $A$. For small dense matrices, computing all eigenvalues and eigenvectors directly should be more efficient.