Compute an upper bound on generalized eigenvalues (by using the coefficients)


Consider the generalized symmetric eigenvalue problem \begin{equation} A x = \lambda B x, \end{equation} with $A, B$ symmetric and $B$ positive definite.

For some computations, I was trying to derive an upper bound for the generalized eigenvalues of this problem in terms of the matrix coefficients. Taking a suitable matrix norm $\| \cdot\|$ and an eigenvector $x$, you get \begin{equation} |\lambda| = \frac{\|Ax\|}{\|Bx\|} \leq \frac{\|A\| \|x\|}{\|Bx\|}. \end{equation} The numerator is easy to estimate, for example by taking an eigenvector $x$ normalized in $\| \cdot \|_{\infty}$ (the sup-norm; then you just sum up absolute values of the matrix entries to bound that part). But what should I do with the denominator?

Edit: as mentioned in Algebraic Pavel's and user161825's answers, you could do that by estimating the smallest eigenvalue $\mu_1$ of $B$ from below by some $0 < \bar{\mu} < \mu_{1}$, but how do you obtain such a $\bar{\mu}$ just from the coefficients of $B$? Of course, for computations, inverting the matrix $B$ isn't feasible; are there smart ways to get this bound?

One possible answer could be to use the neat Gershgorin circles, but this only applies to special cases of $B$, namely when $B$ is strictly diagonally dominant. Then you can iterate over all rows $k$ and, depending on the sign of the diagonal entry $B_{k,k}$ (let's assume here it's positive), calculate $\displaystyle \xi_k = B_{k,k} - \sum_{\substack{i=1, \\ i \neq k}}^{n} | B_{k,i} | $. In the end you just take the minimal $\xi_k$ over $1 \leq k \leq n$ and have your lower estimate for $\mu_1$, and hence for the denominator.
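As a quick sketch of this Gershgorin estimate (Python with NumPy; the matrix `B` below is just a made-up example):

```python
import numpy as np

def gershgorin_lower_bound(B):
    """min_k ( B[k,k] - sum_{i != k} |B[k,i]| ).

    For a symmetric, strictly diagonally dominant B with positive
    diagonal this is a positive lower bound on the smallest eigenvalue."""
    B = np.asarray(B, dtype=float)
    off_diag = np.sum(np.abs(B), axis=1) - np.abs(np.diag(B))
    return np.min(np.diag(B) - off_diag)

# Made-up strictly diagonally dominant SPD example
B = np.array([[4.0, 1.0, 0.5],
              [1.0, 5.0, 1.0],
              [0.5, 1.0, 3.0]])
bound = gershgorin_lower_bound(B)   # here: min(2.5, 3.0, 1.5) = 1.5
mu1 = np.linalg.eigvalsh(B).min()   # true smallest eigenvalue, >= bound
```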

Let $b>0$ be the smallest eigenvalue of $B$. Then $b^2>0$ is the smallest eigenvalue of $B^2$. It follows that $\langle (B^2-b^2I)x,x\rangle\geq 0$, or equivalently $\|B x\|\geq b\|x\|$.

Alternatively, rewrite your equation as $$ B^{-1}Ax=\lambda x, $$ which implies $|\lambda|\leq \|B^{-1}A\|$. Of course $b=\|B^{-1}\|_2^{-1}$ for the spectral norm, so this result implies $|\lambda|\leq \|A\|/b$, as above.

As for computing a lower bound for the lowest eigenvalue, this may be an interesting read. Inspired by formula 2.54 there, let us assume that $b_1\geq\ldots\geq b_n$ are the non-increasingly ordered eigenvalues of $B$, counted with multiplicity. Observe that $$ \prod_{j=1}^{n-1} b_j\leq \frac{\sum_{j=1}^{n-1} b_j^{n-1}}{n-1}<\frac{\mbox{tr }(B^{n-1})}{n-1}, $$ so that $$ b_n=\frac{\mbox{det }B}{\prod_{j=1}^{n-1} b_j}> \frac{(n-1)\mbox{det }B}{\mbox{tr }(B^{n-1})}>0 $$ when $n>1$.

This leaves us with the following (probably poor) estimate $$ |\lambda|< \frac{\|A\|\mbox{tr }(B^{n-1})}{(n-1)\mbox{det }B}. $$
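A quick numerical sanity check of this trace/determinant bound (a Python/NumPy sketch; `A` and `B` are made-up test matrices and $\|A\|_2$ is used for $\|A\|$):

```python
import numpy as np

def trace_det_bound(A, B):
    """|lambda| < ||A||_2 * tr(B^(n-1)) / ((n-1) * det(B)), for n > 1."""
    n = B.shape[0]
    tr_pow = np.trace(np.linalg.matrix_power(B, n - 1))
    return np.linalg.norm(A, 2) * tr_pow / ((n - 1) * np.linalg.det(B))

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])       # symmetric
B = np.array([[4.0, 1.0],
              [1.0, 3.0]])       # SPD
bound = trace_det_bound(A, B)    # here: 2 * tr(B) / det(B) = 14/11
# generalized eigenvalues = eigenvalues of B^{-1} A
lams = np.linalg.eigvals(np.linalg.solve(B, A))
```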

The problem $Ax=\lambda Bx$ is equivalent to $B^{-1}Ax=\lambda x$, so the generalized eigenvalues are the eigenvalues of $B^{-1}A\sim B^{-1/2}AB^{-1/2}$. Hence the eigenvalues can be bounded from above by any norm of $B^{-1}A$, e.g., $$ |\lambda|\leq\|B^{-1}A\|_2,\quad|\lambda|\leq\|B^{-1/2}AB^{-1/2}\|_2,\quad \text{etc.} $$ Note that the latter inequality is sharper since $\|B^{-1}A\|_2\geq\|B^{-1/2}AB^{-1/2}\|_2$ (still assuming that $B$ is SPD). Of course, if the chosen matrix norm is submultiplicative, you can take out $B$ and get $$ |\lambda|\leq\|A\|\|B^{-1}\|. $$
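A small numerical check that the symmetrized bound is indeed the sharper of the two (Python/NumPy sketch with random symmetric $A$ and SPD $B$; $B^{-1/2}$ is formed via the eigendecomposition of $B$):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                        # symmetric
N = rng.standard_normal((4, 4))
B = N @ N.T + 4.0 * np.eye(4)      # SPD

# B^{-1/2} from the symmetric eigendecomposition of B
w, V = np.linalg.eigh(B)
B_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T

bound_unsym = np.linalg.norm(np.linalg.solve(B, A), 2)      # ||B^{-1}A||_2
bound_sym = np.linalg.norm(B_inv_sqrt @ A @ B_inv_sqrt, 2)  # ||B^{-1/2}AB^{-1/2}||_2
lams = np.linalg.eigvalsh(B_inv_sqrt @ A @ B_inv_sqrt)      # generalized eigenvalues
```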

Another simple bound can be obtained as follows: $$Ax=\lambda Bx\quad\Rightarrow\quad x^TAx=\lambda x^TBx\quad\Rightarrow\quad\lambda=\frac{x^TAx}{x^TBx}.$$ Hence $$ |\lambda|=\frac{|x^TAx|}{x^TBx}=\left|\frac{x^TAx}{x^Tx}\right|\frac{x^Tx}{x^TBx} \leq\max_{y\neq 0}\left|\frac{y^TAy}{y^Ty}\right|\max_{z\neq 0}\frac{z^Tz}{z^TBz} =\frac{\max_{y\neq 0}\left|\frac{y^TAy}{y^Ty}\right|}{\min_{z\neq 0}\frac{z^TBz}{z^Tz}} =\frac{\max_i|\lambda_{i}(A)|}{\lambda_{\min}(B)}. $$ Note that since the matrices $A$ and $B$ are symmetric and $B$ is SPD, this bound is again equal to $\|A\|_2\|B^{-1}\|_2$.
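This Rayleigh-quotient bound can also be checked numerically (a sketch with made-up random matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M + M.T                       # symmetric
N = rng.standard_normal((5, 5))
B = N @ N.T + 5.0 * np.eye(5)     # SPD

# |lambda| <= max_i |lambda_i(A)| / lambda_min(B) = ||A||_2 ||B^{-1}||_2
bound = np.max(np.abs(np.linalg.eigvalsh(A))) / np.linalg.eigvalsh(B).min()
lams = np.linalg.eigvals(np.linalg.solve(B, A))  # generalized eigenvalues
```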