Bounds on the diagonal elements of inverse of negative definite matrix


Let $X$ be an $n\times n$ negative definite matrix such that $X_{ii}<0$ for all $i$ and $X_{ij}\geq 0$ for $i\neq j$. Denote by $Y=X^{-1}$ the inverse of $X$. I would like to find bounds on the diagonal elements of $Y$. Can we find $\underline{Y}$ and $\overline{Y}$ such that $\underline{Y}\leq Y_{ii} \leq \overline{Y}$ for all $i$?

Note that $-X$ satisfies the definition of an $M$-matrix and so its inverse is such that all its elements are nonnegative. This implies that all the elements of $Y$ are nonpositive and so one potential upper bound is $\overline{Y}=0$ but I would like to find a tighter bound if possible.

Also, in the special case in which $X$ is diagonal, it is easy to show that $$ \left(\max_i X_{ii}\right)^{-1}\leq Y_{ii} \leq \left(\min_i X_{ii}\right)^{-1}. $$ I am hoping that a result like this extends to nondiagonal $X$.
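For concreteness, here is a quick numerical sanity check of the setup (a sketch in NumPy; the diagonally dominant construction below is just one convenient, hypothetical way to generate a matrix with the stated sign pattern):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a symmetric X with X_ii < 0 and X_ij >= 0 for i != j.
# Making the negative diagonal strictly dominant forces negative definiteness.
A = rng.uniform(0.0, 1.0, size=(n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)
X = A - np.diag(A.sum(axis=1) + 1.0)

Y = np.linalg.inv(X)

print(np.all(np.linalg.eigvalsh(X) < 0))  # X is negative definite: True
print(np.all(Y <= 0))                     # all entries of Y are nonpositive: True
```

The second check is exactly the $M$-matrix observation above: $-X$ is an $M$-matrix, so $(-X)^{-1}\geq 0$ entrywise and hence $Y\leq 0$.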

On BEST ANSWER

You can use the Rayleigh quotient. For any Hermitian matrix $A$ and nonzero vector $x$, $$\lambda_{\min}(A) \leq \frac{x^*Ax}{x^*x} \leq \lambda_{\max}(A).$$ Taking $A = X^{-1}$ and $x=e_i$, the $i$th standard basis vector (with a $1$ at position $i$ and zeros everywhere else), you obtain $$\lambda_{\min}(X^{-1}) \leq (X^{-1})_{ii} \leq \lambda_{\max}(X^{-1}).$$ Moreover, $\lambda_{\min}(X^{-1}) = 1/\lambda_{\max}(X)$ and $\lambda_{\max}(X^{-1}) = 1/\lambda_{\min}(X)$: if $Xv = \lambda v$ then $X^{-1}v = (1/\lambda)v$, and since all eigenvalues of $X$ are negative, inversion reverses their order.
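These eigenvalue bounds are easy to verify numerically (a sketch, using the same kind of hypothetical diagonally dominant test matrix as above):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# A symmetric negative definite X with the question's sign pattern.
A = rng.uniform(0.0, 1.0, size=(n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)
X = A - np.diag(A.sum(axis=1) + 1.0)

Y = np.linalg.inv(X)
lam = np.linalg.eigvalsh(X)      # eigenvalues of X, ascending, all negative

lower = 1.0 / lam[-1]            # lambda_min(X^{-1}) = 1/lambda_max(X)
upper = 1.0 / lam[0]             # lambda_max(X^{-1}) = 1/lambda_min(X)
d = np.diag(Y)
print(np.all((lower <= d) & (d <= upper)))   # True
```

Note that these bounds recover the diagonal-case bounds from the question, since for diagonal $X$ the eigenvalues are exactly the $X_{ii}$.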

On

To begin, I'll copy some ideas from my earlier post here.

Let $s$ be equal to the absolute value of the smallest (most negative) diagonal entry of $X$. We can rewrite $X$ as $X = A - sI$, where $A$ is a non-negative matrix.

By the Perron-Frobenius theorem, $\rho(A)$ is itself an eigenvalue of $A$.
Furthermore, for any eigenvalue $\lambda$ of $A$, $\lambda - s$ is an eigenvalue of $X$; in particular, $\rho(A) - s$ is an eigenvalue of $X$. Since the eigenvalues of $X$ all have negative real part, $$ \operatorname{Re}(\rho(A) - s) < 0 \implies \rho(A) < s. $$ Denote $B = \frac 1s A$, so that $\rho(B) = \rho(A)/s < 1$. Let $X' = \frac 1s X = B - I$. Since $\rho(B) < 1$, the infinite series $-\sum_{n \geq 0} B^n$ converges to $(B-I)^{-1}$. With that established, we have $$ X^{-1} = [sX']^{-1} = -s^{-1} \sum_{n=0}^\infty B^n = -s^{-1} \sum_{n=0}^\infty \left(I + \frac 1s X\right)^n. $$ From there, one could apply the ideas from a recent post by the same asker. In particular, let $v$ denote a Perron eigenvector of $B$ (i.e. $v > 0$ and $Bv = \rho(B)v$), and let $D = \operatorname{diag}(v)$. Then $D^{-1}BD$ has constant row sums, and $$ X^{-1} = -s^{-1} D\left[\sum_{n=0}^\infty (D^{-1}BD)^n\right]D^{-1}. $$