What more can be said about $\max_{v^\mathsf{T} v=1} \frac{v^\mathsf{T} B v}{v^\mathsf{T} A v}$?


Assume we have a positive semidefinite matrix $A$. Another matrix $B$ is equal to $A$ except that its $i$th row and $i$th column are zero and $B_{ii}=(n-1)A_{ii}$, i.e. \begin{align} B&=A-e_i e_i^T A- A e_i e_i^T+(n-1) A_{ii} e_i e_i^T\cr \end{align}
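As a quick numerical sanity check of this construction (a sketch, not part of the question; the positive definite $A$ below is an arbitrary example), one can build $B$ from the rank-one formula and confirm the zeroed row/column and positive semidefiniteness:

```python
import numpy as np

rng = np.random.default_rng(0)
n, i = 5, 2

# Arbitrary positive definite A for illustration (any PSD A would do).
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

e = np.zeros(n)
e[i] = 1.0

# B = A - e_i e_i^T A - A e_i e_i^T + (n-1) A_ii e_i e_i^T
B = A - np.outer(e, A[i, :]) - np.outer(A[:, i], e) \
    + (n - 1) * A[i, i] * np.outer(e, e)

# Off-diagonal entries of row i and column i vanish:
off = [j for j in range(n) if j != i]
print(np.allclose(B[i, off], 0), np.allclose(B[off, i], 0))  # True True

# All eigenvalues of B are nonnegative, so B is PSD here:
print(np.all(np.linalg.eigvalsh(B) >= -1e-9))  # True
```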

So $B$ is also positive semidefinite. Now, my question is: what can we say about the following quantity? \begin{align} r=\max_{v^\mathsf{T} v=1} \frac{v^\mathsf{T} B v}{v^\mathsf{T} A v} \leq \frac{\lambda_{\max}\big(B\big)}{\lambda_{\min}\big(A\big)} \end{align} Edit: if division by zero is a concern, we can assume the matrices are positive definite. I need (and must) use the information available about $B$ to compute this quantity. What I have done: based on the comment by @Michael and the answer by @user7530, the quantity above equals $\lambda_{\max}(A^{-1}B)$. So we have
\begin{align} A^{-1}B&=I-A^{-1}e_i e_i^T A- A^{-1} A e_i e_i^T+(n-1) A_{ii} A^{-1} e_i e_i^T\cr &=I-(A^{-1})_{*i}A_{i*}- e_i e_i^T+(n-1) A_{ii} (A^{-1})_{*i} e_i^T\cr r=\lambda_{\max}(A^{-1}B)&=1-\lambda_{\min}\Big((A^{-1})_{*i}A_{i*}+ e_i e_i^T-(n-1) A_{ii} (A^{-1})_{*i} e_i^T\Big) \end{align} (The eigenvalues of $A^{-1}B$ are real, since it is similar to the symmetric matrix $A^{-1/2}BA^{-1/2}$.) I don't know how to proceed. Even a tight bound would be very good for me; I want to use this in an optimization problem.
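For intuition, $r$ and the crude bound can be compared numerically (a self-contained sketch on an arbitrary positive definite $A$, assuming SciPy is available; `scipy.linalg.eigh` solves the symmetric-definite generalized problem $Bv=\lambda Av$ directly):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n, i = 5, 2

# Arbitrary positive definite A for illustration.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

e = np.zeros(n)
e[i] = 1.0
B = A - np.outer(e, A[i, :]) - np.outer(A[:, i], e) \
    + (n - 1) * A[i, i] * np.outer(e, e)

# r = lambda_max(A^{-1} B), computed as the largest generalized
# eigenvalue of the symmetric-definite pencil (B, A).
r = eigh(B, A, eigvals_only=True)[-1]

# The crude bound lambda_max(B) / lambda_min(A).
bound = np.linalg.eigvalsh(B)[-1] / np.linalg.eigvalsh(A)[0]

print(r <= bound + 1e-12)  # True: r never exceeds the bound
```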

There is 1 answer below.

The key is to see that whenever $v$ is an eigenvector of $A^{-1}B$, $\frac{v^TBv}{v^TAv}$ is the corresponding eigenvalue. This follows from \begin{align*} A^{-1}Bv &= \lambda v\\ Bv &= \lambda Av\\ v^TBv &= \lambda v^TAv. \end{align*}

One important area where this identity shows up is the differential geometry of surfaces, where it proves that the principal curvatures are the eigenvalues of the Weingarten map.

Now the optimality condition of your maximization problem (notice that the objective is scale-invariant, so you are free to ignore the constraint as long as $v \neq 0$) is

$$\frac{2Bv}{v^TAv} - \frac{2Avv^TBv}{(v^TAv)^2}=0$$ or $$A^{-1}Bv = \left(\frac{v^TBv}{v^TAv}\right)v,$$ so an optimal $v$ is necessarily an eigenvector of $A^{-1}B$, and we saw above that all eigenvectors will satisfy this equation. Therefore the solution to your optimization problem is the largest eigenvalue of $A^{-1}B$ (or equivalently, the largest generalized eigenvalue $Bv = \lambda Av$).
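This conclusion is easy to check numerically (a sketch on two arbitrary positive definite matrices, assuming SciPy is available): the top generalized eigenvector attains its eigenvalue as a Rayleigh quotient, and no other direction exceeds it.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n = 4

# Two arbitrary symmetric positive definite matrices for illustration.
MA = rng.standard_normal((n, n)); A = MA @ MA.T + n * np.eye(n)
MB = rng.standard_normal((n, n)); B = MB @ MB.T + n * np.eye(n)

# Largest generalized eigenvalue/eigenvector of B v = lambda A v.
w, V = eigh(B, A)
lam, v = w[-1], V[:, -1]

# The eigenvector attains its eigenvalue as a Rayleigh quotient...
print(np.isclose((v @ B @ v) / (v @ A @ v), lam))  # True

# ...and no random direction beats it.
U = rng.standard_normal((1000, n))
ratios = np.einsum('ij,jk,ik->i', U, B, U) / np.einsum('ij,jk,ik->i', U, A, U)
print(np.all(ratios <= lam + 1e-9))  # True
```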