Partial converse to the diagonal dominance criterion


Diagonal dominance criterion. The tridiagonal symmetric matrix $$\tag{1} A=\begin{bmatrix} a_1 & b_1 & 0& 0& 0 & \ldots & 0 \\ b_1 & a_2 & b_2 & 0 & 0 & \ldots & 0 \\ 0 & b_2 & a_3 & b_3 &0 & \ldots & 0 \\ \ldots & \ldots &\ldots &\ldots &\ldots &\ldots &\ldots \\ 0& \ldots &\ldots &0 & b_{n-2} & a_{n-1} & b_{n-1} \\ 0& \ldots &\ldots &0 & 0 & b_{n-1} & a_n \end{bmatrix},\qquad a_j, b_j\ge 0$$ is positive semidefinite (i.e. $\mathbf{x}^TA\mathbf{x}\ge 0$ for all column vectors $\mathbf x\in \mathbb{R}^n$) if $$ a_j\ge b_{j-1}+b_j\qquad \forall j=1,\ldots, n,$$ where $b_{0}=b_n=0$. In this case one says that $A$ is diagonally dominant.
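The criterion is easy to check numerically. Here is a small sanity check (a sketch, assuming numpy; the helper `tridiag` and the concrete values of $a_j, b_j$ are mine, not from the question): build a diagonally dominant tridiagonal matrix and verify that its eigenvalues are nonnegative.

```python
import numpy as np

def tridiag(a, b):
    """Symmetric tridiagonal matrix with diagonal a and off-diagonal b."""
    return np.diag(a) + np.diag(b, 1) + np.diag(b, -1)

# Choose off-diagonals b_j >= 0, then take a_j = b_{j-1} + b_j + slack,
# so the dominance condition a_j >= b_{j-1} + b_j holds with room to spare.
b = np.array([1.0, 2.0, 0.5, 3.0])
bpad = np.concatenate(([0.0], b, [0.0]))   # pad with b_0 = b_n = 0
a = bpad[:-1] + bpad[1:] + 0.1             # diagonally dominant with slack 0.1
A = tridiag(a, b)

# Positive semidefinite iff all eigenvalues are >= 0:
eigs = np.linalg.eigvalsh(A)
print(eigs.min() >= 0)  # True
```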

This criterion is sufficient but not necessary, as the following matrix shows: $$A_2=\begin{bmatrix} a_1 & 1 \\ 1 & a_2 \end{bmatrix},\quad 0<a_1<1,\ a_2>1,\ a_1a_2 >1. $$ This matrix is not diagonally dominant, but it is positive semidefinite. Note, however, that it suffices to conjugate $A_2$ with a diagonal matrix to make it diagonally dominant: $$ \begin{bmatrix} \sqrt{a_2}& 0 \\ 0 &\frac{1}{\sqrt{a_2}}\end{bmatrix} A_2 \begin{bmatrix} \sqrt{a_2}& 0 \\ 0 &\frac{1}{\sqrt{a_2}}\end{bmatrix} = \begin{bmatrix} a_1a_2 & 1 \\ 1 & 1\end{bmatrix}.$$
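A numerical illustration of this counterexample (a sketch, assuming numpy; the specific values $a_1=1/2$, $a_2=3$ are my choice and satisfy the stated constraints):

```python
import numpy as np

a1, a2 = 0.5, 3.0                      # 0 < a1 < 1 < a2 and a1*a2 > 1
A2 = np.array([[a1, 1.0], [1.0, a2]])

# Not diagonally dominant: a_1 < b_1 = 1 ...
print(a1 >= 1.0)                       # False
# ... yet positive definite, since tr(A2) > 0 and det(A2) = a1*a2 - 1 > 0:
print(np.linalg.eigvalsh(A2).min() > 0)  # True

# Conjugating by D = diag(sqrt(a2), 1/sqrt(a2)) restores dominance:
D = np.diag([np.sqrt(a2), 1.0 / np.sqrt(a2)])
B = D @ A2 @ D                         # equals [[a1*a2, 1], [1, 1]]
print(B[0, 0] >= 1.0 and B[1, 1] >= 1.0)  # True: both rows dominate
```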

Question. Let $A$ be an $n\times n$ tridiagonal symmetric matrix with nonnegative entries. Assume that $A$ is positive semidefinite. Does there exist a diagonal matrix $D=\mathrm{diag}(d_1,\ldots, d_n)$, with $d_j > 0$, such that $D^TAD$ is diagonally dominant?


I will write here a proof of the diagonal dominance criterion for convenience. Let $A$ be the matrix in (1). Then, assuming $b_0=b_n=0$ and $x_0=x_{n+1}=0$, $$ \begin{split} \mathbf{x}^TA\mathbf{x}&= \sum_{j=1}^n \left(a_j x_j^2 + 2b_jx_jx_{j+1}\right)\\ &\ge \sum_{j=1}^n \left(b_{j-1}x_j^2 + b_j x_j^2 +2b_jx_jx_{j+1}\right) \\ &= \sum_{j=1}^n b_j(x_{j+1}+x_j)^2 \ge 0, \end{split} $$ where the last equality uses the index shift $\sum_{j=1}^n b_{j-1}x_j^2=\sum_{j=1}^n b_jx_{j+1}^2$ (valid since $b_0=b_n=0$ and $x_{n+1}=0$).
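In the borderline case $a_j=b_{j-1}+b_j$ the inequality in the middle becomes an equality, so the whole chain collapses to the identity $\mathbf{x}^TA\mathbf{x}=\sum_j b_j(x_j+x_{j+1})^2$. A quick numerical check of that identity (a sketch, assuming numpy; the random sizes and seed are my choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
b = rng.uniform(0.1, 2.0, size=n - 1)     # b_1, ..., b_{n-1} > 0
bpad = np.concatenate(([0.0], b, [0.0]))  # pad with b_0 = b_n = 0
a = bpad[:-1] + bpad[1:]                  # borderline case: a_j = b_{j-1} + b_j

A = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
x = rng.standard_normal(n)

lhs = x @ A @ x
# sum_{j=1}^{n-1} b_j (x_j + x_{j+1})^2  (the j = n term vanishes since b_n = 0)
rhs = sum(b[j] * (x[j] + x[j + 1]) ** 2 for j in range(n - 1))
print(np.isclose(lhs, rhs))  # True
```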

Best answer.

Remarks. i) The correct formula is $x^TAx\geq \sum_j b_j(x_{j+1}+x_j)^2$.

ii) You cannot merely require $D\geq 0$; indeed $D=0$ would trivially work! A better requirement is $\mathrm{sign}(D^TAD)=\mathrm{sign}(A)$, that is, $\mathrm{rank}(D^TAD)=\mathrm{rank}(A)$.

Case 1. We assume $b_j>0$ (not merely $b_j\geq 0$) and $A\geq 0$. The conditions on the $d_i$ (diagonal dominance of $D^TAD$, after dividing the $j$-th row by $d_j>0$) are

$a_nd_n\geq b_{n-1}d_{n-1}\;,\;a_{n-1}d_{n-1}\geq b_{n-2}d_{n-2}+b_{n-1}d_{n},\cdots$

You can take $d_n=1,d_{n-1}=\dfrac{a_n}{b_{n-1}},d_{n-2}=\dfrac{\Delta_2}{b_{n-1}b_{n-2}},d_{n-3}=\dfrac{\Delta_3}{b_{n-1}b_{n-2}b_{n-3}},\cdots$

where $\Delta_k$ is the determinant of the trailing principal $k\times k$ submatrix of $A$ (its last $k$ rows and columns). Clearly $\Delta_k\geq 0$ and consequently $d_{n-k}\geq 0$; moreover $\Delta_k=0$ iff $d_{n-k}=0$.

Of course, if $A>0$, then $D>0$, $D^TAD>0$ and we are done. Otherwise I did not carry out the calculations.
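The recursion $d_{n-k}=\Delta_k/(b_{n-1}\cdots b_{n-k})$ is easy to check numerically. Below is a sketch (assuming numpy; the helper name `dominance_scaling` and the concrete positive definite, non-dominant example are mine):

```python
import numpy as np

def dominance_scaling(a, b):
    """d_{n-k} = Delta_k / (b_{n-1} * ... * b_{n-k}),
    where Delta_k is the trailing k x k principal minor of A."""
    n = len(a)
    A = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
    d = np.empty(n)
    d[n - 1] = 1.0                                 # d_n = 1
    prod_b = 1.0
    for k in range(1, n):
        prod_b *= b[n - 1 - k]                     # b_{n-1} * ... * b_{n-k}
        d[n - 1 - k] = np.linalg.det(A[n - k:, n - k:]) / prod_b
    return A, np.diag(d)

# A positive definite example with b_j > 0 that is NOT diagonally dominant
# (a_1 = 0.6 < b_1 = 1, yet all leading principal minors are positive):
a = np.array([0.6, 3.0, 3.0, 3.0])
b = np.array([1.0, 1.0, 1.0])
A, D = dominance_scaling(a, b)
B = D @ A @ D

# Every row of B = D A D is now diagonally dominant:
off = np.sum(np.abs(B), axis=1) - np.abs(np.diag(B))
print(np.all(np.diag(B) >= off - 1e-9))  # True
```

In fact the recursion $\Delta_{k+1}=a_{n-k}\Delta_k-b_{n-k}^2\Delta_{k-1}$ forces the middle dominance conditions to hold with equality; only the first row has slack, equal to $\det A$ (after clearing denominators).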

EDIT 2.

Case 2. We assume $b_j\geq 0$ and $A\geq 0$. The zero $b_i$'s separate $A$ into diagonal blocks, $A=\mathrm{diag}(A_1,\cdots,A_k)$. It suffices to apply Case 1 to each $A_i$.
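The splitting step might be sketched as follows (the helper `blocks` is a hypothetical name; it returns the half-open index ranges of the diagonal blocks cut out by the zero off-diagonal entries):

```python
def blocks(b, tol=0.0):
    """Index ranges [i, j) of the diagonal blocks of an (n x n) tridiagonal
    matrix, where b lists the n-1 off-diagonal entries b_1, ..., b_{n-1}."""
    cuts = [0] + [i + 1 for i, bi in enumerate(b) if bi <= tol] + [len(b) + 1]
    return [(cuts[k], cuts[k + 1]) for k in range(len(cuts) - 1)]

# b_2 = 0 and b_5 = 0 cut a 6 x 6 matrix into blocks of sizes 2, 3, 1:
print(blocks([1.0, 0.0, 2.0, 3.0, 0.0]))  # [(0, 2), (2, 5), (5, 6)]
```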

Case 3. We assume only $A\geq 0$ (the off-diagonal entries may be negative). Then the matrix $C$ defined by $c_{i,j}=|a_{i,j}|$ has the same spectrum as $A$ and is $\geq 0$; thus, as in the previous cases, we can associate to $C$ a matrix $D$. Then $D^TAD$ is diagonally dominant.
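The reason $C$ has the same spectrum: for a tridiagonal matrix, flipping the sign of an off-diagonal entry $b_j\mapsto -b_j$ amounts to conjugating by a diagonal matrix $S=\mathrm{diag}(\pm 1)$, so $C=SAS$. A numerical check (a sketch, assuming numpy; the concrete entries are my choice):

```python
import numpy as np

a = np.array([2.0, 2.0, 2.0])
b = np.array([-1.0, 1.0])                      # one negative off-diagonal entry
A = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
C = np.abs(A)                                  # entrywise absolute values

# C = S A S with S = diag(1, -1, -1), so the (sorted) spectra coincide:
print(np.allclose(np.linalg.eigvalsh(A), np.linalg.eigvalsh(C)))  # True
```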