Optimizing singular Rayleigh quotient subject to linear constraint


I want to numerically solve $$\min_x \frac{x^TAx}{x^TBx} \quad \mathrm{s.t.}\quad Cx=0,$$ where $A$ and $B$ are large sparse matrices, $A$ is positive semi-definite, $B$ is positive-definite, and $C$ is sparse.

One approach would be to find a basis $N$ for the nullspace of $C$, write $x=Ny$, and then reduce the above to an unconstrained eigenvector problem. But if I try to compute $N$ (using e.g. a QR decomposition of $C$) it ends up intractably dense.

So instead I want to incorporate constraint projection into the power iteration for finding $x$: more specifically, I start with a random vector $x^0$, then iterate:

  1. Solve $A\tilde x^{i+1} = Bx^i$ for $\tilde x^{i+1}$.
  2. Solve $CC^T\lambda = C\tilde{x}^{i+1}$ for $\lambda$.
  3. Compute $x^{i+1} = \frac{\tilde{x}^{i+1}-C^T\lambda}{\|\tilde{x}^{i+1}-C^T\lambda\|_B}.$

However, I'm running into two problems: (A) the matrix $A$ is singular, so the linear system in step (1) may not have a solution (and is never unique when it does); (B) the matrix $CC^T$ may be singular.

Concern B is less serious than concern A, since the right-hand side $C\tilde{x}^{i+1}$ lies in the range of $C$, which equals the range of $CC^T$, so the linear system in step (2) is always consistent. I can therefore solve step (2) with a robust method for rank-deficient least squares, e.g. a QR decomposition. What should I do to fix step (1), though? Of course, there is a strong temptation to "fix" $A$ by adding a small multiple of the identity; is this fix mathematically sound?
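As an aside, the projection in steps (2)–(3) can be carried out without forming $CC^T$ at all. A minimal sketch (solver choice and function name are my own, not prescribed above) using SciPy's `lsqr`, which tolerates a rank-deficient $C$:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsqr

def project_to_nullspace(C, v):
    """Orthogonal projection of v onto null(C).

    Solving the least-squares problem min_lam ||C^T lam - v||_2 makes
    C^T lam the projection of v onto the row space of C, so the residual
    v - C^T lam is the projection onto null(C). lsqr copes with a
    rank-deficient C (i.e. singular CC^T).
    """
    lam = lsqr(C.T, v, atol=1e-14, btol=1e-14)[0]
    return v - C.T @ lam

# C with a redundant row, so CC^T is singular:
C = csr_matrix(np.array([[1.0, 0.0, 0.0],
                         [2.0, 0.0, 0.0]]))
p = project_to_nullspace(C, np.array([1.0, 2.0, 3.0]))
```

The residual of the least-squares fit is exactly $\tilde{x} - C^T\lambda$ from step (3), so this sidesteps concern B entirely.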


Best answer:

Answering my own question, as I've found a rather satisfactory solution to this problem.

From the KKT conditions of the original problem one sees that the minimizer is the $x$-part of the eigenvector with smallest eigenvalue in the generalized eigenvalue problem $$\begin{bmatrix} A & C^T \\ C & 0\end{bmatrix}\begin{bmatrix}x\\\mu\end{bmatrix} = \lambda \begin{bmatrix} B & 0\\0 & 0\end{bmatrix} \begin{bmatrix} x \\ \mu \end{bmatrix}.$$ It follows that the following power iteration method will find $x$:

  1. Initialize $x^0$ to a random vector.
  2. Until convergence, repeat:
    • Compute $y^i = Bx^i$
    • Solve $$\begin{bmatrix} A & C^T \\ C & 0\end{bmatrix}\begin{bmatrix}z^i\\\mu^i\end{bmatrix} = \begin{bmatrix}y^i\\0\end{bmatrix}.$$ In the case that $A$ and $CA^{-1}C^T$ are invertible, the solution has the closed form $$z^i = A^{-1}y^i - A^{-1}C^T(CA^{-1}C^T)^{-1}CA^{-1}y^i.$$
    • Set $x^{i+1} = z^i/\sqrt{(z^i)^TBz^i}.$
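A minimal SciPy sketch of this iteration (function names and the toy test problem are my own; a serious large-scale implementation would use a sparse symmetric-indefinite factorization of the KKT matrix rather than `splu`). It assumes the KKT matrix is nonsingular, i.e. $C$ has full row rank and $A$ is positive definite on $\mathrm{null}(C)$ — which is exactly the situation where $A$ itself may be singular but the method still works:

```python
import numpy as np
from scipy.sparse import bmat, csc_matrix, diags, identity
from scipy.sparse.linalg import splu

def constrained_min_rayleigh(A, B, C, max_iter=500, tol=1e-12, seed=0):
    """Inverse power iteration on the KKT pencil: repeatedly solve
    [A C^T; C 0][z; mu] = [B x; 0] and B-normalize z.
    Assumes the KKT matrix is invertible (C full row rank,
    A positive definite on null(C))."""
    n, m = A.shape[0], C.shape[0]
    K = bmat([[A, C.T], [C, None]], format="csc")
    solve = splu(K).solve          # factor once, reuse every iteration
    x = np.random.default_rng(seed).standard_normal(n)
    x /= np.sqrt(x @ (B @ x))
    for _ in range(max_iter):
        z = solve(np.concatenate([B @ x, np.zeros(m)]))[:n]
        x_new = z / np.sqrt(z @ (B @ z))
        # eigenvectors are defined up to sign, so check both
        if min(np.linalg.norm(x_new - x), np.linalg.norm(x_new + x)) < tol:
            x = x_new
            break
        x = x_new
    return x

# Toy problem: A = diag(0, 1, 2) is singular but positive definite on
# null(C); B = I; constraint x_1 = 0. The minimizer is +-e_2, ratio 1.
A = diags([0.0, 1.0, 2.0]).tocsc()
B = identity(3, format="csc")
C = csc_matrix(np.array([[1.0, 0.0, 0.0]]))
x = constrained_min_rayleigh(A, B, C)
```

Note that the KKT solve enforces $Cz^i = 0$ exactly (up to roundoff), so no separate projection step is needed.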
Another answer:

According to results of Jagannathan, Dinkelbach, and Ródenas, $$\min_{x\neq 0}\left\{ \frac{f(x)}{g(x)} : Cx=0 \right\} \leq \lambda$$ if and only if $$\min_{x\neq 0} \left\{ f(x) - \lambda g(x) : Cx = 0 \right\} \leq 0,$$ where $f(x) - \lambda g(x) = x^T(A-\lambda B)x$. This is still not a simple problem, but it lets you run a bisection search on $\lambda$.
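Since $x^T(A-\lambda B)x$ is a homogeneous quadratic, its constrained minimum is $\leq 0$ exactly when $A-\lambda B$ has a nonpositive eigenvalue on $\mathrm{null}(C)$, so each bisection step reduces to an eigenvalue (or inertia) test. A small dense sketch of this bisection (names and the toy problem are my own; a sparse implementation would test inertia via an $LDL^T$ factorization instead of forming a null-space basis):

```python
import numpy as np
from scipy.linalg import eigh, null_space

def min_ratio_bisection(A, B, C, tol=1e-10):
    """Bisection for min x^T A x / x^T B x subject to Cx = 0.
    min ratio <= lam  iff  N^T (A - lam B) N has an eigenvalue <= 0,
    where the columns of N span null(C)."""
    N = null_space(C)                  # dense basis; fine for small problems
    An, Bn = N.T @ A @ N, N.T @ B @ N
    lo = 0.0                           # A is PSD, so the ratio is >= 0
    hi = eigh(An, Bn, eigvals_only=True)[-1]   # largest eigenvalue bounds it
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if eigh(An - mid * Bn, eigvals_only=True)[0] <= 0.0:
            hi = mid                   # minimum ratio is <= mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Toy problem: A = diag(0, 1, 2), B = I, constraint x_1 = 0;
# the constrained minimum ratio is 1.
val = min_ratio_bisection(np.diag([0.0, 1.0, 2.0]), np.eye(3),
                          np.array([[1.0, 0.0, 0.0]]))
```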