Let $A$ be the generator matrix of a continuous-time Markov chain. This means that $A$ has non-negative off-diagonal elements $A_{ij} \ge 0$ for $i \ne j$, and row sums $\sum_j A_{ij}$ equal to $0$. For example, $A$ could be $$ A = \left( \begin{matrix} -7 & 4 & 3 \\ 1 & -2 & 1 \\ 3 & 5 & -8 \end{matrix} \right). $$ I am interested in proving the following claim about the matrix $B = x(x I - A)^{-1}$ for some $x > 0$.
Claim. For the matrix $B = x(x I - A)^{-1}$, it holds that the diagonal elements of $B - B^2$ are non-negative.
Using numerical simulations I have convinced myself that this claim is likely true; however, I have not been able to make much progress toward proving it.
It is straightforward to show that the matrix $B$ is stochastic. However, the claim above is not true for all stochastic matrices $B$; there is something special about stochastic matrices of this particular form.
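For anyone who wants to reproduce the numerical evidence, here is a minimal sketch (assuming numpy) that samples random transition-rate matrices and checks both that $B$ is stochastic and that the diagonal of $B - B^2$ is non-negative:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_generator(n):
    """Random transition-rate matrix: positive off-diagonal, zero row sums."""
    A = rng.uniform(0.0, 1.0, size=(n, n))
    np.fill_diagonal(A, 0.0)
    np.fill_diagonal(A, -A.sum(axis=1))
    return A

ok = True
for _ in range(200):
    n = int(rng.integers(2, 7))
    A = random_generator(n)
    x = float(rng.uniform(0.1, 10.0))
    B = x * np.linalg.inv(x * np.eye(n) - A)
    # B should be stochastic: non-negative entries, unit row sums.
    ok = ok and bool(np.all(B >= -1e-10)) and bool(np.allclose(B.sum(axis=1), 1.0))
    # The claimed property: the diagonal of B - B^2 is non-negative.
    ok = ok and bool(np.all(np.diag(B - B @ B) >= -1e-10))
print(ok)
```

Of course this is only evidence, not a proof, but it is a quick way to convince oneself the claim is plausible.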
Any ideas?
Let me continue from @user1551's idea and show that indeed $\mathbf{B} - \mathbf{B}^2$ has non-negative diagonal entries.
Step 1. We recap @user1551's reduction.
Let $\mathbf{A}$ be a transition-rate matrix as in the OP, and let $\mathbf{B} = (\mathbf{I} - \mathbf{A})^{-1}$. (Here we assume $x = 1$ without loss of generality, since $x(x\mathbf{I} - \mathbf{A})^{-1} = (\mathbf{I} - \tfrac{1}{x}\mathbf{A})^{-1}$ and $\tfrac{1}{x}\mathbf{A}$ is again a transition-rate matrix.) Choose $d > 0$ sufficiently large, e.g. $d \geq \max_i |A_{ii}|$, so that
$$ \mathbf{P} = \mathbf{I} + \frac{1}{d} \mathbf{A} $$
is a stochastic matrix. Solving this for $\mathbf{A}$ gives $\mathbf{A} = d(\mathbf{P} - \mathbf{I})$, hence
\begin{align*} \mathbf{B} - \mathbf{B}^2 &= -\mathbf{A}(\mathbf{I} - \mathbf{A})^{-2} \\ &= \frac{d}{(d+1)^2} (\mathbf{I} - \mathbf{P})(\mathbf{I} - c \mathbf{P})^{-2} \end{align*}
where $c = \frac{d}{d+1} \in (0, 1)$. In light of this, it suffices to prove that the diagonal entries of $(\mathbf{I} - \mathbf{P})(\mathbf{I} - c \mathbf{P})^{-2}$ are non-negative for every stochastic matrix $\mathbf{P}$ and every $c \in (0, 1)$.
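As a sanity check of this reduction, the identity can be verified numerically on the example matrix from the question (a sketch assuming numpy; the choice $d = 10$ is arbitrary as long as $d \geq \max_i |A_{ii}| = 8$):

```python
import numpy as np

# The example generator matrix from the question.
A = np.array([[-7.0, 4.0, 3.0],
              [1.0, -2.0, 1.0],
              [3.0, 5.0, -8.0]])
I = np.eye(3)

d = 10.0                  # any d >= max_i |A_ii| = 8 works
P = I + A / d             # stochastic by the choice of d
c = d / (d + 1.0)

B = np.linalg.inv(I - A)  # the x = 1 case
lhs = B - B @ B
R = np.linalg.inv(I - c * P)
rhs = d / (d + 1.0) ** 2 * (I - P) @ R @ R
print(np.allclose(lhs, rhs))
```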
Step 2. Let $X = (X_n)_{n\geq 0}$ denote a Markov chain with the transition matrix $\mathbf{P}$. Fix a state $j$, and define $(\tau_k)_{k\geq 0}$ as the sequence of return times to state $j$. More precisely,
$$ \tau_0 = 0 \qquad \text{and} \qquad \tau_{k+1} = \inf\{ n > \tau_k : X_n = j \}. $$
If $\mathbb{P}_j$ denotes the law of $X$ started at $j$, then
\begin{align*} \mathbf{e}_j^{\top} (\mathbf{I} - c \mathbf{P})^{-1} \mathbf{e}_j &= \sum_{n=0}^{\infty} c^n (\mathbf{e}_j^{\top} \mathbf{P}^n \mathbf{e}_j) = \sum_{n=0}^{\infty} c^n \mathbb{P}_j(X_n = j) \\ &= \sum_{n=0}^{\infty} \sum_{k=0}^{\infty} c^n \mathbb{P}_j(\tau_k = n) = \sum_{k=0}^{\infty} \mathbb{E}_j[c^{\tau_k}] = \sum_{k=0}^{\infty} \mathbb{E}_j[c^{\tau_1}]^k \\ &= \frac{1}{1 - \mathbb{E}_j[c^{\tau_1}]}. \end{align*}
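This renewal-type identity can also be checked numerically: instead of simulating the chain, $\mathbb{E}_j[c^{\tau_1}]$ can be obtained by first-step analysis, solving a small linear system (a sketch assuming numpy; the matrix $\mathbf{M}$ below is just a device that zeroes out the $j$-th coordinate):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
P = rng.uniform(size=(n, n))
P /= P.sum(axis=1, keepdims=True)   # random stochastic matrix
c = 0.9
j = 0

I = np.eye(n)
e_j = I[:, j]

# First-step analysis for h_i = E_i[c^{T_j}], where T_j is the first
# time n >= 1 at which X_n = j:  h = c P (e_j + M h), with M zeroing
# the j-th coordinate.  Rearranged: (I - c P M) h = c P e_j.
M = I - np.outer(e_j, e_j)
h = np.linalg.solve(I - c * (P @ M), c * (P @ e_j))

lhs = np.linalg.inv(I - c * P)[j, j]    # resolvent diagonal entry
rhs = 1.0 / (1.0 - h[j])                # 1 / (1 - E_j[c^{tau_1}])
print(np.isclose(lhs, rhs))
```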
In the last line, the identity $\mathbb{E}_j[c^{\tau_k}] = \mathbb{E}_j[c^{\tau_1}]^k$ follows from the strong Markov property, with the convention $c^{\infty} = 0$. This computation is related to the original problem as follows:
\begin{align*} \frac{\mathrm{d}}{\mathrm{d}c} \frac{1 - c}{1 - \mathbb{E}_j[c^{\tau_1}]} &= \frac{\mathrm{d}}{\mathrm{d}c} \mathbf{e}_j^{\top} (1 - c)(\mathbf{I} - c \mathbf{P})^{-1} \mathbf{e}_j \\ &= - \mathbf{e}_j^{\top} (\mathbf{I} - \mathbf{P})(\mathbf{I} - c\mathbf{P})^{-2} \mathbf{e}_j. \end{align*}
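The derivative identity can likewise be confirmed by a finite-difference check (a sketch assuming numpy, with an arbitrary random stochastic matrix and state $j$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
P = rng.uniform(size=(n, n))
P /= P.sum(axis=1, keepdims=True)   # random stochastic matrix
I = np.eye(n)
j = 1

def g(c):
    """(1 - c) * [(I - c P)^{-1}]_{jj}."""
    return (1.0 - c) * np.linalg.inv(I - c * P)[j, j]

c, eps = 0.7, 1e-6
numeric = (g(c + eps) - g(c - eps)) / (2.0 * eps)   # central difference
R = np.linalg.inv(I - c * P)
exact = -((I - P) @ R @ R)[j, j]
print(np.isclose(numeric, exact, atol=1e-5))
```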
Step 3. The above relation shows that the claim is equivalent to showing that
$$ f(c) := \frac{1 - c}{1 - \mathbb{E}_j[c^{\tau_1}]} $$
is non-increasing for $c \in (0, 1)$. Substituting $c = 1 - x$, this is the same as showing that $1/f(1-x)$ is non-increasing for $x \in (0, 1)$. However,
\begin{align*} \frac{1}{f(1-x)} &= \frac{1 - \mathbb{E}_j[(1-x)^{\tau_1}]}{x} = \int_{0}^{1} \mathbb{E}_j[ \tau_1 (1 - xt)^{\tau_1 - 1} ] \, \mathrm{d}t. \end{align*}
Since $\tau_1 \geq 1$, the map $x \mapsto \tau_1 (1 - xt)^{\tau_1 - 1}$ is non-increasing for each $t \in [0, 1]$, hence the claim is proved. $\square$
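As a final sanity check, one can verify numerically that $c \mapsto (1-c)\,[(\mathbf{I} - c\mathbf{P})^{-1}]_{jj}$ is indeed non-increasing on a grid of values of $c$, for every state $j$ of a random stochastic matrix (a sketch assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
P = rng.uniform(size=(n, n))
P /= P.sum(axis=1, keepdims=True)   # random stochastic matrix
I = np.eye(n)

cs = np.linspace(0.01, 0.99, 99)
ok = True
for j in range(n):
    # f(c) = (1 - c) * [(I - c P)^{-1}]_{jj} evaluated on the grid.
    vals = np.array([(1.0 - c) * np.linalg.inv(I - c * P)[j, j] for c in cs])
    ok = ok and bool(np.all(np.diff(vals) <= 1e-12))   # non-increasing
print(ok)
```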