Suppose $A=(a_{ij})$ is an $n\times n$ Hermitian matrix. Show that if every diagonal entry satisfies $a_{ii} = \lambda_{\max}$ or $a_{ii} = \lambda_{\min}$ (for each $1\leq i \leq n$), then $A$ is diagonal.
I've already proven elsewhere that the diagonal entries of $A$ must lie between $\lambda_{\min}$ and $\lambda_{\max}$. I'm fairly sure I want to combine this with the min-max theorem to get the desired result, but it's not clear to me how to use these facts to show that the off-diagonal entries of $A$ are zero.
Any thoughts would be greatly appreciated. Thanks in advance.
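As a quick sanity check of the lemma you've already proven (each $a_{ii}$ lies between $\lambda_{\min}$ and $\lambda_{\max}$, since $a_{ii}=\mathbf e_i^* A\,\mathbf e_i$ is a Rayleigh quotient), here is a minimal numerical sketch on a random Hermitian matrix; the matrix and tolerances are just illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random n x n Hermitian matrix A = (M + M*)/2.
n = 6
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

eigs = np.linalg.eigvalsh(A)        # real eigenvalues, ascending order
lam_min, lam_max = eigs[0], eigs[-1]
diag = np.real(np.diag(A))          # the diagonal of a Hermitian matrix is real

# Each a_ii = e_i* A e_i is a Rayleigh quotient, so it is bounded
# by the extreme eigenvalues.
assert np.all(diag >= lam_min - 1e-12)
assert np.all(diag <= lam_max + 1e-12)
```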
Proof 1
First consider
$B := A + \gamma I\succ 0$
for some large enough $\gamma \gt 0$. Since adding $\gamma I$ shifts every eigenvalue and every diagonal entry by $\gamma$, if $A$ has each diagonal entry equal to its min or max eigenvalue, so does $B$. Of course $B$ and $A$ have the same off-diagonal entries, so it suffices to prove that the HPD matrix $B$ must be diagonal.
Suppose $r$ of the diagonal entries of $B$ are equal to $\lambda_\text{max}$ (and, by hypothesis, the remaining $n-r$ equal $\lambda_\text{min}$). Using permutation matrices to effect similarity transforms, we may assume WLOG that these are the first $r$ diagonal entries of $B$. Then with eigenvalues in the usual ordering, $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$,
for $m \in\{1,2,\dots,n\}$
$\sum_{k=1}^m b_{k,k} \geq \sum_{k=1}^m \lambda_k$
with equality for $m=n$ (check the trace). Indeed, for $m\leq r$ each $b_{k,k}=\lambda_\text{max}\geq\lambda_k$; for $m\gt r$, note $\sum_{k=m+1}^n b_{k,k} = (n-m)\lambda_\text{min}\leq \sum_{k=m+1}^n \lambda_k$ and subtract from the trace equality.
So the diagonal of $B$ majorizes its eigenvalues. Using Schur concavity of the $n$th elementary symmetric polynomial, we have
$\prod_{k=1}^n b_{k,k}\leq \prod_{k=1}^n \lambda_k = \det\big(B\big)\leq \prod_{k=1}^n b_{k,k}$
where the right-hand inequality is the Hadamard determinant inequality. Both inequalities are therefore met with equality, and for an HPD matrix equality in Hadamard's inequality occurs iff the matrix is diagonal.
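A hedged numerical sketch of the Hadamard determinant inequality used above, on a randomly generated HPD matrix (the construction $CC^*+I$ is just one convenient way to get an HPD example), together with the equality case for a diagonal matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random HPD matrix B = C C* + I (the +I keeps it safely positive definite).
n = 5
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = C @ C.conj().T + np.eye(n)

# Hadamard: det(B) <= product of diagonal entries, for HPD B.
det_B = np.real(np.linalg.det(B))
prod_diag = np.prod(np.real(np.diag(B)))
assert det_B <= prod_diag * (1 + 1e-9)

# Equality case: for a diagonal HPD matrix the two sides coincide.
D = np.diag(np.array([3.0, 2.0, 1.0, 1.0, 0.5]))
assert np.isclose(np.linalg.det(D), np.prod(np.diag(D)))
```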
Proof 2
Again using the HPD matrix $B$ whose first $r$ diagonal entries are equal to $\lambda_\text{max}$, write $B$ in terms of its columns:
$B = \bigg[\begin{array}{c|c|c|c|c|c} \mathbf b_1 &\cdots &\mathbf b_r& \mathbf b_{r+1}&\cdots & \mathbf b_{n} \end{array}\bigg]$
Now consider diagonal entry $k$ of $B^2$, for $k \in\{1,2,\dots,r\}$:
$\lambda_\text{max}^2= (b_{k,k})^2 \leq \big\Vert\mathbf b_k\big\Vert_2^2=(B^2)_{k,k}\leq \lambda_\text{max}^2$
where the lower bound follows by computing the length of $\mathbf b_k$ (which contains the entry $b_{k,k}=\lambda_\text{max}$), and the upper bound is the OP's lemma applied to the Hermitian matrix $B^2$, whose largest eigenvalue is $\lambda_\text{max}^2$. Since both bounds are met with equality, we know $\mathbf b_k = \lambda_\text{max}\mathbf e_k$ (standard basis vector $\mathbf e_k$), so we have
$ B= \begin{bmatrix}\lambda_\text{max}I_r & \mathbf 0 \\ \mathbf 0& B^{'}\end{bmatrix}$
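The identity $(B^2)_{k,k} = \Vert\mathbf b_k\Vert_2^2$ used in the step above holds for any Hermitian $B$ (row $k$ of $B$ is the conjugate of column $k$). A minimal sketch checking it on a random Hermitian example:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random Hermitian B.
n = 5
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = (M + M.conj().T) / 2

B2 = B @ B
for k in range(n):
    col_norm_sq = np.linalg.norm(B[:, k]) ** 2
    # (B^2)_{kk} = (row k of B) . (col k of B) = b_k* b_k, since B = B*.
    assert np.isclose(np.real(B2[k, k]), col_norm_sq)
```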
As for $B^{'}$: for $k\in\{r+1,r+2,\dots,n\}$ we have $b_{k,k} =\lambda_\text{min}=\lambda_n$, and since $B$ is block diagonal, the eigenvalues of $B^{'}$ are $\lambda_{r+1},\dots,\lambda_n$, each satisfying
$\lambda_n \leq \lambda_k$
Summing this bound over $k$:
$(n-r)\cdot \lambda_n \leq \lambda_{r+1} + \lambda_{r+2}+...+\lambda_n= \text{trace}\big(B^{'}\big) = (n-r)\cdot \lambda_n$
since this is met with equality, we know
$\lambda_{r+1} = \lambda_{r+2}=...=\lambda_n$
Finally, since $B^{'}$ is Hermitian it is diagonalizable, with all eigenvalues equal to $\lambda_n$, so
$B^{'} = Q\big(\lambda_n I_{n-r}\big)Q^{-1}=\lambda_n I_{n-r} QQ^{-1}=\lambda_n I_{n-r}$
which completes the proof.
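As a closing sanity check, here is a hedged sketch of the contrapositive on a random example: a Hermitian matrix with nonzero off-diagonal entries should have some diagonal entry strictly between $\lambda_\text{min}$ and $\lambda_\text{max}$ (strictness holds generically; the tolerances below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# A random Hermitian matrix is non-diagonal with probability 1.
n = 4
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
off_diag = A - np.diag(np.diag(A))
assert np.any(np.abs(off_diag) > 1e-8)   # genuinely non-diagonal

# Contrapositive of the theorem: some diagonal entry must lie
# strictly between lam_min and lam_max.
eigs = np.linalg.eigvalsh(A)
lam_min, lam_max = eigs[0], eigs[-1]
diag = np.real(np.diag(A))
strictly_inside = (diag > lam_min + 1e-10) & (diag < lam_max - 1e-10)
assert np.any(strictly_inside)
```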