Showing that if each diagonal entry of a Hermitian matrix $A$ equals $\lambda_{\max}$ or $\lambda_{\min}$, then $A$ is diagonal.


Suppose $A=(a_{ij})$ is an $n\times n$ Hermitian matrix. Show that if, for all $1\leq i \leq n$,

$a_{ii} = \lambda_{\max}$ or $a_{ii} = \lambda_{\min}$,

then $A$ is diagonal.

I've already proven elsewhere that the diagonals of $A$ must lie between $\lambda_{\min}$ and $\lambda_{\max}$, and I'm pretty sure I want to combine this with the min-max theorem to get the desired result, but it's not clear to me how I can use these results to say something about the off-diagonals of $A$ to show they're zero.
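As a sanity check (not part of any proof), that diagonal-bound fact is easy to confirm numerically; the snippet below is just an illustration with NumPy on an arbitrary random Hermitian matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Build an arbitrary 5x5 Hermitian matrix A = M + M^H.
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = M + M.conj().T

evals = np.linalg.eigvalsh(A)   # real eigenvalues, ascending order
diag = np.real(np.diag(A))      # the diagonal of a Hermitian matrix is real
# every diagonal entry lies between lambda_min and lambda_max
assert np.all(diag >= evals[0] - 1e-12)
assert np.all(diag <= evals[-1] + 1e-12)
```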

Any thoughts would be greatly appreciated. Thanks in advance.


Proof 1.
First consider
$B := A + \gamma I\succ 0$
for some large enough $\gamma \gt 0$.

The eigenvalues of $B$ are those of $A$ shifted up by $\gamma$, so if the diagonal entries of $A$ all equal its $\lambda_\text{min}$ or $\lambda_\text{max}$, then the diagonal entries of $B$ all equal the corresponding extreme eigenvalues of $B$. Of course $B$ and $A$ have the same off-diagonal entries, so it suffices to prove that the HPD matrix $B$ must be diagonal.

Suppose $r$ of the diagonal entries of $B$ equal $\lambda_\text{max}$; by hypothesis the remaining $n-r$ equal $\lambda_\text{min}$. Using permutation matrices to effect similarity transforms, we may assume WLOG that these are the first $r$ diagonal entries of $B$. Then, with eigenvalues in the usual ordering $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$,

for $m \in\{1,2,\ldots,n\}$,
$\sum_{k=1}^m b_{k,k} \geq \sum_{k=1}^m \lambda_k,$
with equality for $m=n$ (check the trace). For $m\leq r$ this is immediate, since each $b_{k,k}=\lambda_\text{max}\geq\lambda_k$; for $m\gt r$ it follows from the $m=n$ equality, since the omitted diagonal terms satisfy $\sum_{k=m+1}^n b_{k,k}=(n-m)\lambda_\text{min}\leq\sum_{k=m+1}^n\lambda_k$.

So the diagonal of $B$ majorizes its eigenvalues. Using Schur concavity of the $n$th elementary symmetric polynomial we have

$\prod_{k=1}^n b_{k,k}\leq \prod_{k=1}^n \lambda_k = \det\big(B\big)\leq \prod_{k=1}^n b_{k,k}$

where the right-hand inequality is the Hadamard determinant inequality. The chain shows it is met with equality, and for HPD matrices equality holds iff the matrix is diagonal.
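Hadamard's determinant inequality, and its equality case for diagonal HPD matrices, can be illustrated numerically; this is only a sketch with NumPy on an arbitrary random HPD matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
# Arbitrary Hermitian positive definite matrix B = M M^H + I.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = M @ M.conj().T + np.eye(4)

det_B = np.real(np.linalg.det(B))
prod_diag = np.prod(np.real(np.diag(B)))
assert det_B <= prod_diag + 1e-9   # Hadamard: det(B) <= product of diagonal

# Equality holds when the HPD matrix is diagonal.
D = np.diag([1.0, 2.0, 3.0])
assert np.isclose(np.linalg.det(D), np.prod(np.diag(D)))
```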

Proof 2.
Again using the HPD matrix $B$ whose first $r$ diagonal entries equal $\lambda_\text{max}$, write it in terms of its columns:
$B = \bigg[\begin{array}{c|c|c|c|c|c} \mathbf b_1 &\cdots &\mathbf b_r& \mathbf b_{r+1}&\cdots & \mathbf b_{n} \end{array}\bigg]$

Now consider diagonal entry $k$ of $B^2$ for $k \in\{1,2,\ldots,r\}$:
$\lambda_\text{max}^2= (b_{k,k})^2 \leq \big\Vert\mathbf b_k\big\Vert_2^2=(B^2)_{k,k}\leq \lambda_\text{max}^2$
where the middle equality holds because $(B^2)_{k,k}=\mathbf b_k^*\mathbf b_k$ for Hermitian $B$, the lower bound follows since $b_{k,k}$ is one component of $\mathbf b_k$, and the upper bound is the diagonal bound from the OP's question applied to $B^2$ (whose largest eigenvalue is $\lambda_\text{max}^2$). Since the chain is met with equality we know $\mathbf b_k = \lambda_\text{max}\mathbf e_k$ (standard basis vector $\mathbf e_k$), so we have

$ B= \begin{bmatrix}\lambda_\text{max}I_r & \mathbf 0 \\ \mathbf 0& B^{'}\end{bmatrix}$
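The key identity used above, $(B^2)_{k,k}=\Vert\mathbf b_k\Vert_2^2$ for Hermitian $B$, can be checked numerically; the snippet below is a NumPy illustration on a random Hermitian matrix, not part of the argument:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = M + M.conj().T                 # Hermitian, so B^2 = B^H B

B2 = B @ B
for k in range(4):
    # diagonal entry k of B^2 is the squared 2-norm of column k of B
    assert np.isclose(np.real(B2[k, k]), np.linalg.norm(B[:, k]) ** 2)
```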

As for $B^{'}$: for $k\in\{r+1,r+2,\ldots,n\}$
we have $b_{k,k} =\lambda_\text{min}=\lambda_n$ and
$\lambda_n \leq \lambda_k $

Summing the bound over these indices:
$(n-r)\cdot \lambda_n \leq \lambda_{r+1} + \lambda_{r+2}+\cdots+\lambda_n= \operatorname{trace}\big(B^{'}\big) = (n-r)\cdot \lambda_n.$
Since this is met with equality, we know
$\lambda_{r+1} = \lambda_{r+2}=\cdots=\lambda_n.$

Finally, since $B^{'}$ is Hermitian it is diagonalizable, and
$B^{'} = Q\big(\lambda_n I_{n-r}\big)Q^{-1}=\lambda_n QQ^{-1}=\lambda_n I_{n-r},$
which completes the proof.
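The theorem itself can be probed numerically through its contrapositive: a generic non-diagonal Hermitian matrix should have at least one diagonal entry strictly between $\lambda_\text{min}$ and $\lambda_\text{max}$. A NumPy sketch on a random example (an illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(3)
# Contrapositive of the theorem: a non-diagonal Hermitian matrix must
# have some diagonal entry strictly between lambda_min and lambda_max.
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = M + M.conj().T              # generic (non-diagonal) Hermitian matrix
evals = np.linalg.eigvalsh(A)
lam_min, lam_max = evals[0], evals[-1]
diag = np.real(np.diag(A))
strict = (diag > lam_min + 1e-9) & (diag < lam_max - 1e-9)
assert strict.any()
```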


By considering $A-\lambda_\min(A)I$ and by permuting the rows and columns of $A$, we may assume that $A$ is singular and positive semidefinite, $a_{11}=\cdots=a_{rr}=\lambda_\max$ and $a_{r+1,r+1}=\cdots=a_{nn}=\lambda_\min=0$ with $r\lt n$. (If no diagonal entry equals $\lambda_\min$, then $\operatorname{tr}(A)=n\lambda_\max$ forces every eigenvalue to equal $\lambda_\max$, so $A=\lambda_\max I$ is already diagonal.)

Since $A$ is positive semidefinite, $\lambda_\max=\|A\|_2$. Denote the standard basis of $\mathbb C^n$ by $\{e_1,e_2,\ldots,e_n\}$. When $1\le i\le r$, we have $\lambda_\max=\|A\|_2\ge\|Ae_i\|_2\ge\|a_{ii}e_i\|_2=a_{ii}=\lambda_\max$, and equality in $\|Ae_i\|_2\ge\|a_{ii}e_i\|_2$ forces the other entries of the $i$th column to vanish. Therefore $Ae_i=a_{ii}e_i=\lambda_\max e_i$, i.e. $A=\lambda_\max I_r\oplus H$ for some $(n-r)\times(n-r)$ Hermitian matrix $H$.
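The first step, $\lambda_\max=\|A\|_2$ for positive semidefinite $A$, can be verified numerically; this is an illustrative NumPy check on a random PSD matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
# For a positive semidefinite matrix, the spectral norm equals lambda_max,
# since the singular values coincide with the (nonnegative) eigenvalues.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M @ M.conj().T                      # positive semidefinite
lam_max = np.linalg.eigvalsh(A)[-1]     # largest eigenvalue
spec_norm = np.linalg.norm(A, 2)        # largest singular value
assert np.isclose(lam_max, spec_norm)
```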

By assumption, $a_{r+1,r+1}=\cdots=a_{nn}=0$. Hence $\operatorname{tr}(H)=0$. It follows that if $H\ne0$, it must possess a negative eigenvalue. Yet this is impossible, because every eigenvalue of $H$ is an eigenvalue of the positive semidefinite matrix $A$. Therefore $H$ must be zero and $A=\lambda_\max I_r\oplus0$ is a diagonal matrix.