Is Gershgorin bound of roots sharp?


The Gershgorin circle theorem states that the eigenvalues of a matrix $A$ lie in the union of the associated Gershgorin discs. The companion matrix

$A=\begin{pmatrix} 0 & 0 & \dots & 0 & -a_0 \\ 1 & 0 & \dots & 0 & -a_1 \\ 0 & 1 & \dots & 0 & -a_2 \\ \vdots & \vdots & \vdots & \vdots & \vdots \\ 0 & 0 & \dots & 1 & -a_{d-1} \\ \end{pmatrix}$

has characteristic polynomial $(-1)^d(X^d+a_{d-1}X^{d-1}+\ldots+a_0)$.

Thus, the roots of $X^d+a_{d-1}X^{d-1}+\ldots+a_0$ lie in $\displaystyle\bar{D}(0,|a_0|)\cup\bigcup_{i=1}^{d-2}\bar{D}(0,1+|a_i|)\cup\bar{D}(-a_{d-1},1)$, which is contained in $\displaystyle\bar{D}(0,1+\max_{0\leq i\leq d-2}|a_i|)\cup\bar{D}(-a_{d-1},1)$.

Is this bound sharp?

Best answer:

The two bounds $$ R_1=1+\max_{k=0,\dots,n-1}|a_k|\quad\text{and}\quad R_2=\max\bigl(1,\,|a_0|+|a_1|+\dots+|a_{n-1}|\bigr) $$ are fast to compute but not very sharp. You get better results by computing the unique positive root of the convex, monotone function $$ f(R)=R^n-|a_{n-1}|\,R^{n-1}-\dots-|a_1|\,R-|a_0|. $$ Since $f(R_1)\ge 0$ and $f(R_2)\ge 0$, this root is smaller than both bounds. A sufficiently accurate approximation is obtained with a few rounds of Newton's method.
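As a minimal sketch of this idea (the function name `root_bound` and the starting point are my own choices, not from the answer): Newton's method applied to $f$, started at $R_1=1+\max_k|a_k|$ where $f(R_1)\ge 0$, decreases monotonically to the positive root, which is an upper bound for the moduli of all roots of the polynomial.

```python
def root_bound(coeffs, tol=1e-12, max_iter=50):
    """Upper bound for the moduli of the roots of
    x^n + a_{n-1} x^{n-1} + ... + a_0, given coeffs = [a_0, ..., a_{n-1}].

    Computes the unique positive root of
    f(R) = R^n - |a_{n-1}| R^{n-1} - ... - |a_0|
    by Newton's method, starting at R_1 = 1 + max|a_k|, where f(R_1) >= 0.
    """
    a = [abs(c) for c in coeffs]
    n = len(a)
    R = 1.0 + max(a)                     # starting point R_1 with f(R_1) >= 0
    for _ in range(max_iter):
        # simultaneous Horner evaluation of f and f' at R
        f, df = 1.0, 0.0
        for k in range(n - 1, -1, -1):
            df = df * R + f
            f = f * R - a[k]
        step = f / df
        R -= step
        if abs(step) < tol:
            break
    return R
```

For example, for $x^2-2x-3=(x-3)(x+1)$ the bound is the positive root of $R^2-2R-3$, namely $R=3$, which here equals the largest root modulus exactly.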


Even sharper bounds are obtained from the Graeffe iterates of the original polynomial $p(x)=x^n+a_{n-1}x^{n-1}+\ldots+a_0$, which are $$ p_{(1)}(x^2)=p(x)\cdot p(-x),\quad p_{(k+1)}(x^2)=p_{(k)}(x)\cdot p_{(k)}(-x). $$ The roots of each iterate are the squares of the roots of the previous one, so a root bound $R$ for $p_{(k)}$ yields the bound $R^{1/2^k}$ for $p$. For a numerical application of these principles see, for instance, J.-C. Yakoubsohn, "Numerical analysis of a bisection-exclusion method to find zeros of univariate analytic functions".