I have a situation where a matrix $A=[a_{ij}]$ arises.
From the physics of the problem, I expect this matrix to have one null eigenvalue, while the remaining eigenvalues have negative real part. However, I have not been able to prove the second part of the statement.
I would appreciate any help, hint, or guidance on how to approach the problem.
The off-diagonal elements are given by $$ a_{ij} = \begin{cases} n_{ji}, & \text{if } j<i \\ n_{ij}+1, & \text{if } j>i \end{cases} $$
The diagonal elements are given by $ a_{ii} = -\sum_{\substack{k=1 \\ k \neq i}}^N a_{ki} $, so each column of $A$ sums to zero.
It is clear that the row vector of all-ones $\mathbb 1_N$ is always a left eigenvector with null eigenvalue.
Furthermore, the $n_{ij}$ are such that:
$\bullet$ $ n_{ij}>0 $
$\bullet$ $ n_{ij} $ increase with $i: n_{ij} < n_{(i+1)j}$
$\bullet$ $ n_{ij} $ decrease with $j: n_{ij} > n_{i(j+1)}$
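For concreteness, the construction can be coded up directly. Here is a quick NumPy sketch (the helper name `build_A` and the sample values are my own) that assembles $A$ from a table of $n_{ij}$, $i<j$, and confirms that $\mathbb 1_N$ is a left null vector:

```python
import numpy as np

def build_A(n, N):
    """Assemble A from a dict n[(i, j)] (0-based indices, i < j)."""
    A = np.zeros((N, N))
    for (i, j), v in n.items():
        A[i, j] = 1 + v   # above the diagonal: a_ij = 1 + n_ij
        A[j, i] = v       # below the diagonal: a_ij = n_ji
    np.fill_diagonal(A, -A.sum(axis=0))  # a_ii = -(off-diagonal sum of column i)
    return A

# N = 3 with n12 = 2 > n13 = 1 (decreasing in j) and n13 = 1 < n23 = 1.5 (increasing in i)
A = build_A({(0, 1): 2.0, (0, 2): 1.0, (1, 2): 1.5}, 3)
print(np.ones(3) @ A)   # the all-ones row vector is a left null vector: ~[0 0 0]
```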
Case $N=2$
$$ \begin{bmatrix} -n_{12} & 1+n_{12} \\ n_{12} & -1-n_{12} \\ \end{bmatrix} $$
The eigenvalues are $0$ and $-1-2 n_{12}$.
Case $N=3$
$$ \begin{bmatrix} -n_{12}-n_{13} & 1+n_{12} & 1+n_{13} \\ n_{12} & -1-n_{12}-n_{23} & 1+n_{23} \\ n_{13} & n_{23} & -2-n_{13}-n_{23} \\ \end{bmatrix} $$
The non-zero eigenvalues are given by (after some messy computations, or after asking Wolfram Mathematica):
$$ \frac{-3 - 2 n_{12} - 2 n_{13} - 2 n_{23} \pm \sqrt{1 - 4 n_{12} + 4 n_{12}^2 - 4 n_{12} n_{13} + 4 n_{13}^2 + 4 n_{23} - 4 n_{12} n_{23} - 4 n_{13} n_{23} + 4 n_{23}^2}}{2} $$ If the discriminant is negative, the real part of both eigenvalues is $-(3 + 2 n_{12} + 2 n_{13} + 2 n_{23})/2 < 0$. If it is non-negative, the real parts are still negative, because $(3 + 2 n_{12} + 2 n_{13} + 2 n_{23})^2$ is strictly greater than the discriminant, so the square root cannot overcome the leading term.
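As a quick numerical cross-check of this closed form (a sketch with NumPy; the sample values $n_{12}=2$, $n_{13}=1$, $n_{23}=1.5$ are arbitrary but respect the monotonicity constraints, and the discriminant happens to be positive for them):

```python
import numpy as np

n12, n13, n23 = 2.0, 1.0, 1.5   # n12 > n13 and n13 < n23, as required

A = np.array([
    [-n12 - n13,  1 + n12,        1 + n13      ],
    [ n12,       -1 - n12 - n23,  1 + n23      ],
    [ n13,        n23,           -2 - n13 - n23],
])

disc = (1 - 4*n12 + 4*n12**2 - 4*n12*n13 + 4*n13**2
        + 4*n23 - 4*n12*n23 - 4*n13*n23 + 4*n23**2)
# disc > 0 for these values; use a complex square root otherwise
roots = (-3 - 2*n12 - 2*n13 - 2*n23 + np.array([1, -1]) * np.sqrt(disc)) / 2

print(np.sort(np.linalg.eigvals(A).real))  # 0 together with the two roots above
print(np.sort(roots))
```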
As $N$ increases, the eigenvalue computations become messier... Is there a simpler way to show that they will be negative?
I think I found an answer:
First, use the Gershgorin circle theorem (applied to the columns of $A$) as in Gregory's answer to show that the eigenvalues of $A$ have non-positive real part. In fact, each disc is centered at $a_{ii} < 0$ with radius $|a_{ii}|$, so it touches the imaginary axis only at the origin; any eigenvalue with zero real part must therefore be $0$ itself.
Second, note that $A$ has at least one null eigenvalue, since the columns of the matrix add up to $0$ (equivalently, $\mathbb 1_N A = 0$).
Third, show that $A$ has only one null eigenvalue:
Construct an auxiliary $N\times N$ matrix $B$, equal to $A$ but with the last row zeroed. This can be achieved by adding all the other rows to the last one: since every column of $A$ sums to zero, the result is a row of zeros. Row operations do not change the rank, and hence the nullity, of a matrix, so $B$ has the same nullity as $A$. $$ B = \begin{bmatrix} a_{11} & 1+n_{12} & \cdots & 1+n_{1(N-1)} & 1+n_{1N} \\ n_{12} & a_{22} & \cdots & 1+n_{2(N-1)} & 1+n_{2N} \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ n_{1(N-1)} & n_{2(N-1)} & \cdots & a_{(N-1)(N-1)} & 1+n_{(N-1)N} \\ 0 & 0 & \cdots & 0 & 0 \end{bmatrix} $$
Construct a second auxiliary $(N-1) \times (N-1)$ matrix $C$, equal to $B$ without the last row and column. Note that the eigenvalues of $C$ are also eigenvalues of $B$, since you can construct the eigenvectors of $B$ by padding the eigenvectors of $C$ with a $0$. (In fact, $B$ is block upper triangular, so its spectrum is exactly that of $C$ together with the eigenvalue $0$.)
$$ \text{Let } v=[v_1 \cdots v_{N-1}]^T : Cv = \lambda v $$ $$ \text{Then } w=[v_1 \cdots v_{N-1} \ 0]^T \Rightarrow Bw = \lambda w $$
Finally, use the Gershgorin circle theorem once again, on the columns of $C$, to show that all the eigenvalues of $C$ have strictly negative real part. The off-diagonal entries of column $i$ are non-negative and, since column $i$ of $A$ sums to zero, $$ |\lambda - a_{ii}| \leq \sum_{k\neq i} c_{ki} = |a_{ii}| - n_{iN} < |a_{ii}| $$ Each disc is therefore centered at $a_{ii} < 0$ with radius strictly smaller than $|a_{ii}|$, so it lies in the open left half-plane; in particular, $C$ is invertible.
$C$ has no null eigenvalues $\Rightarrow$ $B$ has only one null eigenvalue $\Rightarrow$ $A$ has only one null eigenvalue $\blacksquare$
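Not a proof, of course, but the whole chain can be stress-tested numerically. The sketch below (with a hypothetical helper `make_n` that just generates a positive table of $n_{ij}$ with the required monotonicity) checks that there is exactly one null eigenvalue, that the remaining eigenvalues have negative real part, and that the Gershgorin discs of $C$ stay in the open left half-plane:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_n(N):
    """Hypothetical generator: n[(i, j)] > 0 for 0-based i < j,
    increasing in the first index, decreasing in the second."""
    up = np.cumsum(rng.uniform(0.1, 1.0, size=N))          # increasing in i
    down = np.cumsum(rng.uniform(0.1, 1.0, size=N))[::-1]  # decreasing in j
    return {(i, j): up[i] + down[j] for i in range(N) for j in range(i + 1, N)}

def build_A(n, N):
    """Assemble A as defined in the question (0-based indices)."""
    A = np.zeros((N, N))
    for (i, j), v in n.items():
        A[i, j] = 1 + v   # above the diagonal: a_ij = 1 + n_ij
        A[j, i] = v       # below the diagonal: a_ij = n_ji
    np.fill_diagonal(A, -A.sum(axis=0))  # each column sums to zero
    return A

for N in (2, 5, 10):
    A = build_A(make_n(N), N)
    ev = np.linalg.eigvals(A)
    assert np.sum(np.abs(ev) < 1e-8) == 1          # exactly one null eigenvalue
    assert (ev[np.abs(ev) > 1e-8].real < 0).all()  # the rest: negative real part
    C = A[:-1, :-1]
    # Gershgorin by columns: the rightmost point of disc i is the column sum -n_iN
    assert (C.sum(axis=0) < 0).all()
print("all checks passed")
```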