While experimenting in Matlab I found something that I can't quite understand.
I am generating random symmetric positive-definite matrices $A$ with both positive and negative entries, and then splitting them into their element-wise positive and negative parts, $B = A_+$ and $C = A_-$.
I then "normalize" both matrices $B,C$ by multiplying them with the diagonal matrix $N$:
\begin{equation*} N_{ii} = \dfrac{1}{1 + \sum_j |A_{ij}|} \end{equation*}
obtaining the "normalized" matrices $NB$, $NC$. While $NA$ has all positive eigenvalues, I know that $NB$, $NC$ need not, and in fact I found many examples of $A$ yielding an $NB$ with both positive and negative eigenvalues.
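For concreteness, here is a minimal sketch of the experiment (in Python/NumPy rather than Matlab; the Wishart-style generator `M @ M.T` is my own assumption, not necessarily the original one):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 8, 2000
lam_min = 0.0
for _ in range(trials):
    M = rng.standard_normal((n, n))
    A = M @ M.T                                # symmetric positive-definite, mixed-sign entries
    B = np.maximum(A, 0.0)                     # element-wise positive part A_+
    N = 1.0 / (1.0 + np.abs(A).sum(axis=1))    # diagonal entries of the normalizer N
    # NB is similar to sqrt(N) B sqrt(N), which is symmetric,
    # so the eigenvalues of NB are real and eigvalsh applies
    s = np.sqrt(N)
    S = s[:, None] * B * s[None, :]
    lam_min = min(lam_min, np.linalg.eigvalsh(S).min())
print(lam_min)
```

The script tracks the most negative eigenvalue of $NB$ found across trials.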
I think that the smallest eigenvalue of $NB$ should satisfy $\lambda_{NB,\min} > -1$, since the spectral radius of $NB$ is at most its maximum row sum, which is strictly less than $1$ by construction of $N$; but what seems inexplicable to me is that the lower bound appears to be considerably higher in practice.
In particular, I have never found any matrix yielding $\lambda_{NB,\min} < -0.1$.
I think this might be due to some bias in how I generate the matrices $A$, but I could not come up with a counterexample showing that the smallest eigenvalue is bounded only by $-1$, if that is indeed the true bound. I tried fiddling with the generator to obtain Gershgorin circles spanning the negative axis, but the eigenvalue still never seems to go below this "magic" threshold.
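One way to stress the threshold is to search over less benign spectra; here is a hedged sketch of such a search (Python/NumPy; generating SPD matrices with log-uniform spectra via a random orthogonal similarity is my own choice of generator, not the original one):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 8, 5000
best = 0.0
for _ in range(trials):
    # random orthogonal Q via QR, then an ill-conditioned SPD matrix
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    lams = 10.0 ** rng.uniform(-6, 3, n)       # log-uniform spectrum
    A = (Q * lams) @ Q.T                       # A = Q diag(lams) Q^T, SPD
    B = np.maximum(A, 0.0)                     # element-wise positive part
    N = 1.0 / (1.0 + np.abs(A).sum(axis=1))    # diagonal of the normalizer
    s = np.sqrt(N)
    best = min(best, np.linalg.eigvalsh(s[:, None] * B * s[None, :]).min())
print(best)
```

The log-uniform spectrum makes some Gershgorin discs of $NB$ reach well into the negative axis, so this is one candidate family for probing how negative $\lambda_{NB,\min}$ can get.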
Am I missing some elephant in the room?
Edit 22/4/20:
After some additional thinking I may have found a somewhat better bound. Write $B$ as $B = \frac12 (|A| + A)$; then, by Gershgorin's theorem, \begin{equation}\lambda_{\min} \geq \min_i\Big((NB)_{ii} - \sum\limits_{j\neq i} (NB)_{ij}\Big)\end{equation} \begin{equation}\lambda_{\min} \geq \min_i\Bigg(\frac{A_{ii}}{1 + \sum_j|A_{ij}|} - \frac{\frac12\sum\limits_{j \neq i}(|A|_{ij}+A_{ij})}{1 + \sum_j|A_{ij}|} \Bigg)\end{equation} The off-diagonal term is certainly bounded by $1$, so: \begin{equation}\lambda_{\min} \geq \min_i\bigg(\frac{A_{ii}}{1 + \sum_j|A_{ij}|} - 1\bigg)\end{equation}
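As a sanity check, the Gershgorin bound can be compared against the actual smallest eigenvalue numerically (Python/NumPy sketch; the generator here is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T                                    # SPD with mixed-sign entries
B = np.maximum(A, 0.0)                         # B = (|A| + A) / 2
N = 1.0 / (1.0 + np.abs(A).sum(axis=1))        # N_ii
NB = N[:, None] * B
# eigenvalues of NB via the similar symmetric matrix sqrt(N) B sqrt(N)
s = np.sqrt(N)
lam_min = np.linalg.eigvalsh(s[:, None] * B * s[None, :]).min()
# Gershgorin lower bound: min_i ( (NB)_ii - sum_{j != i} (NB)_ij );
# off-diagonal entries of NB are nonnegative, so the radius is rowsum - diag
gersh = (2.0 * np.diag(NB) - NB.sum(axis=1)).min()
print(lam_min, gersh)
```

Any generator of SPD matrices with mixed-sign entries should do here; the point is only that `lam_min` never falls below `gersh`.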
Can this be improved without adding hypotheses?