Let $A$ be an $n \times n$ matrix with all nonnegative entries and row sums strictly less than one, let $V$ be an $n \times n$ nonnegative diagonal matrix satisfying $V \leq I$ (entrywise), let $B\equiv\left(I-AV\right)^{-1}$, and finally let $X$ be a vector in the $n$-dimensional simplex, i.e., $X_j \geq 0$, $\sum_{j=1}^n X_j=1$. Consider the matrix $$M \equiv \left(\mathrm{diag}\left\{ B^{T}X\right\} \right)^{-1}B^{T}\left[ V\mathrm{diag}\left\{ X\right\} + (I-V) \mathrm{diag}\left\{ B^{T}X\right\} \right]B\mathrm{diag}(\iota-A \iota),$$ where $\mathrm{diag}\left(u\right)$ is the diagonal matrix formed from the vector $u$ and $\iota$ is the vector of all ones. I want to show that the spectral radius of $M$ is (weakly) lower than one, i.e., $\rho(M)\leq 1$.
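For what it is worth, the claim survives a quick numerical experiment. The NumPy sketch below (all names ad hoc, purely illustrative) builds random instances satisfying the hypotheses and records the largest spectral radius found:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
worst = 0.0
for _ in range(200):
    A = rng.random((n, n))
    A *= rng.uniform(0.1, 0.99) / A.sum(axis=1, keepdims=True)  # row sums < 1
    V = np.diag(rng.random(n))                                  # 0 <= V <= I, diagonal
    X = rng.random(n)
    X /= X.sum()                                                # X in the simplex
    iota = np.ones(n)
    B = np.linalg.inv(np.eye(n) - A @ V)
    DB = np.diag(B.T @ X)
    M = (np.linalg.inv(DB) @ B.T
         @ (V @ np.diag(X) + (np.eye(n) - V) @ DB)
         @ B @ np.diag(iota - A @ iota))
    worst = max(worst, np.abs(np.linalg.eigvals(M)).max())
print("largest spectral radius found:", worst)
```

In every run the spectral radius stays at or below one, consistent with the claim.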
Two simple cases are illustrative. First, if $V = I$ then $M\iota = \iota$, and so $\rho(M)=1$. Second, if $A$ is diagonal then $M$ is diagonal, and we would just need to show that each diagonal element is lower than one. But each diagonal element of $M$ is of the form $$\left(v+\frac{1-v}{1-av}\right)\frac{1-a}{1-av},$$ which is readily shown to be lower than one: clearing denominators, the inequality reduces to $a(1-v)^2 \geq 0$.
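The scalar bound in the diagonal case is easy to confirm on a grid, e.g. with NumPy (illustrative check only):

```python
import numpy as np

# Evaluate (v + (1-v)/(1-a v)) * (1-a)/(1-a v) on a grid of
# 0 <= a < 1 (row sum) and 0 <= v <= 1 (diagonal of V).
a, v = np.meshgrid(np.linspace(0.0, 0.99, 300), np.linspace(0.0, 1.0, 300))
val = (v + (1 - v) / (1 - a * v)) * (1 - a) / (1 - a * v)
print("max over grid:", val.max())  # stays at 1 up to rounding (attained at a=0 or v=1)
```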
The problem above, namely showing that $\rho(M)\leq 1$, comes from a more general problem, which I ultimately need to solve. Let $D_1$ and $D_2$ be two strictly positive diagonal $n \times n$ matrices and let $$\tilde{M}\equiv\left(\mathrm{diag}\left\{ B^{T}X\right\} \right)^{-1}B^{T}\left[V\mathrm{diag}\left\{ X\right\} +\left(I-V\right)\mathrm{diag}\left\{ B^{T}X\right\} \right]D_{1}B\mathrm{diag}\left(\iota-A\iota\right)D_{2}.$$ I want to show that $\rho(\tilde{M})\leq 1$ provided that $$ \tag{*} D_{1}\left(I-A\right)^{-1}\mathrm{diag}\left(\iota-A\iota\right)D_{2}\iota\leq\iota.$$ This is now posted as a separate question here: Bounding spectral radius of special matrix (extension)
The simpler question stated above obtains from this more general question in the special case in which $D_{k}=d_{k}I$, $k=1,2$, with $d_1,d_2$ being positive scalars. In that case $$\tilde{M}=\left(\mathrm{diag}\left\{ B^{T}X\right\} \right)^{-1}B^{T}\left[V\mathrm{diag}\left\{ X\right\} +\left(I-V\right)\mathrm{diag}\left\{ B^{T}X\right\} \right]B\mathrm{diag}\left(\iota-A\iota\right)d_{1}d_{2},$$ while condition (*) simply becomes $d_{1}d_{2} \leq 1$, and so it suffices to prove that $\rho(M)\leq 1$.
Here is an answer using ideas from my solution to the linked question.
First recall that $B=\sum_{k\geq0} (AV)^k$ has nonnegative entries with diagonal entries at least one, and that $B=AVB+I$ implies $$\tag1 \sum_j a_{rj}v_jb_{jm}=b_{rm}-\delta_{rm},$$ where $\delta_{rm}=1$ if $r=m$ and $\delta_{rm}=0$ otherwise.
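Identity (1) is just the matrix equation $AVB=B-I$; a quick numerical check on a random instance (ad hoc names):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.random((n, n))
A *= 0.9 / A.sum(axis=1, keepdims=True)          # row sums = 0.9 < 1
v = rng.random(n)                                # diagonal of V
B = np.linalg.inv(np.eye(n) - A @ np.diag(v))
# identity (1) in matrix form: A V B = B - I
print(np.allclose(A @ np.diag(v) @ B, B - np.eye(n)))
```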
Let us now introduce some notation: $D_B=\newcommand{\diag}{\mbox{ diag}}\diag(B^TX)$, $D_A=\diag(\iota-s)$ with $s=A\iota$ and $D_X=\diag(X)$.
Then $M=D_B^{-1}B^T[VD_X+(I-V)D_B]BD_A$ is only defined if $B^TX$ has no zero components. We assume this for now. It holds whenever $X$ has only positive components, but may fail for certain boundary $X$, for example if $A$ and hence $B$ are block-triangular.
We want to show that its spectral radius satisfies $\rho(M)\leq1$. This is equivalent to showing that $\rho(M')\leq1$ for the symmetric matrix $$M'=D_B^{-1/2}D_A^{1/2}B^T[VD_X+(I-V)D_B]BD_B^{-1/2}D_A^{1/2}$$ similar to $M$. Since $M'$ has the form $F^TGF$ with $F=BD_B^{-1/2}D_A^{1/2}$ and $G=VD_X+(I-V)D_B$ diagonal and nonnegative, it is positive semidefinite, so this in turn is equivalent to showing that $M'-I$ is negative semidefinite. Multiplying with $D_B^{1/2}D_A^{-1/2}$ from the left and the right, this is equivalent to showing that $$M''=B^T[VD_X+(I-V)D_B]B-D_BD_A^{-1}$$ is negative semidefinite. This statement makes sense for all $X$ in the $n$-dimensional simplex. Since $D_X$ and $D_B$ are linear functions of $X$ and $X$ has nonnegative components, it suffices to show this for the unit vectors $X=e_j$, $j=1,\ldots,n$. So we have to show that the matrices $$L_j=B^T[VE_j+(I-V)D_j]B-D_jD_A^{-1}$$ are negative semidefinite, where $E_j=\diag(e_j)$ and $D_j=\diag(B^Te_j)=\diag(b_{j1},\ldots,b_{jn})$. Fixing an arbitrary vector $w$, we have to show that $z_j:=w^TL_jw\leq0$ for all $j$.
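The reduction to the matrices $L_j$ can be sanity-checked numerically: the sketch below (random instance, ad hoc names) confirms that every $L_j$ is negative semidefinite up to rounding.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.random((n, n))
A *= 0.8 / A.sum(axis=1, keepdims=True)          # row sums < 1
v = rng.random(n)
V = np.diag(v)
B = np.linalg.inv(np.eye(n) - A @ V)
s = A @ np.ones(n)                               # s = A iota
DAinv = np.diag(1.0 / (1.0 - s))                 # D_A^{-1}
worst = -np.inf
for j in range(n):
    Ej = np.zeros((n, n)); Ej[j, j] = 1.0        # E_j = diag(e_j)
    Dj = np.diag(B[j])                           # D_j = diag(B^T e_j): j-th row of B
    Lj = B.T @ (V @ Ej + (np.eye(n) - V) @ Dj) @ B - Dj @ DAinv
    worst = max(worst, np.linalg.eigvalsh((Lj + Lj.T) / 2).max())
print("largest eigenvalue over all L_j:", worst)  # should be <= 0 up to rounding
```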
As in the answer to the linked question, it suffices for this to show that $$\tag2 \sum_ja_{rj}v_jz_j\geq z_r\quad\mbox{for all }r,$$ because this implies that the vector $z$ of the $z_r$ can be expressed as $z=-\sum_{k\geq0} (AV)^k d$, where $d$ is the vector of the nonnegative differences $d_r=\sum_j a_{rj}v_jz_j - z_r$; since all the matrices $(AV)^k$ are nonnegative, this gives $z\leq0$, as required.
For a proof of (2), we rewrite $z_j$ using the vector $u=Bw$. We have $$z_j=w^TL_jw=u^TVE_ju+u^T(I-V)D_ju-w^TD_jD_A^{-1}w\\ =v_ju_j^2+\sum_m u_m^2(1-v_m)b_{jm}-\sum_m w_m^2b_{jm}/(1-s_m).$$ Using (1), we calculate $$\sum_j a_{rj}v_jz_j=\sum_j a_{rj}v_j^2u_j^2+\sum_m u_m^2(1-v_m)(b_{rm}-\delta_{rm})-\sum_m w_m^2(b_{rm}-\delta_{rm})/(1-s_m)\\ =\sum_j a_{rj}v_j^2u_j^2+z_r-v_ru_r^2-u_r^2(1-v_r)+w_r^2/(1-s_r)\\ =z_r+d_r,\mbox{ where }d_r=\sum_j a_{rj}v_j^2u_j^2-u_r^2+w_r^2/(1-s_r).$$
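This computation can be verified numerically as well; the NumPy sketch below (random data, ad hoc names) checks the formula for $z_j$ and the identity $\sum_j a_{rj}v_jz_j=z_r+d_r$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
A = rng.random((n, n))
A *= 0.7 / A.sum(axis=1, keepdims=True)          # row sums < 1
v = rng.random(n)
B = np.linalg.inv(np.eye(n) - A @ np.diag(v))
s = A @ np.ones(n)
w = rng.standard_normal(n)
u = B @ w
# z_j = v_j u_j^2 + sum_m b_jm (1-v_m) u_m^2 - sum_m b_jm w_m^2/(1-s_m)
z = v * u**2 + B @ ((1 - v) * u**2) - B @ (w**2 / (1 - s))
d = A @ (v**2 * u**2) - u**2 + w**2 / (1 - s)
# the identity sum_j a_rj v_j z_j = z_r + d_r, i.e. A(v*z) = z + d
print(np.allclose(A @ (v * z), z + d))
```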
For a proof of the nonnegativity of all $d_r$, the first step is to use (1) to calculate $$\sum_j a_{rj}v_ju_j=\sum_{j,m} a_{rj}v_jb_{jm}w_m=\sum_m (b_{rm}-\delta_{rm})w_m=u_r-w_r.$$ Next, we use the Cauchy-Schwarz inequality $\left(\sum_jx_jy_j\right)^2\leq\left(\sum_j x_j^2\right)\left(\sum_j y_j^2\right)$ with $x_j=\sqrt{a_{rj}}$ and $y_j=\sqrt{a_{rj}}v_ju_j$ to show $$\left(\sum_j a_{rj}v_ju_j\right)^2\leq \left(\sum_ja_{rj}\right) \left(\sum_j a_{rj}v_j^2u_j^2\right)=s_r\sum_j a_{rj}v_j^2u_j^2.$$ Hence, except in the exceptional case $s_r=0$, we can estimate (recall $0<s_r<1$) $$d_r\geq\frac1{s_r}(u_r-w_r)^2-u_r^2+\frac{w_r^2}{1-s_r} =\frac{(1-s_r)(u_r-w_r)^2-s_r(1-s_r)u_r^2+s_rw_r^2}{s_r(1-s_r)}\\ =\frac{\left((1-s_r)u_r-w_r\right)^2}{s_r(1-s_r)}\geq0.$$ This completes the proof, except in the exceptional case.
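The completing-the-square step at the end is an exact algebraic identity in $s$, $u$, $w$; a throwaway check on random scalars:

```python
import numpy as np

# Check: (1-s)(u-w)^2 - s(1-s)u^2 + s w^2 == ((1-s)u - w)^2 for any s, u, w.
rng = np.random.default_rng(4)
s = rng.random(1000)
u = rng.standard_normal(1000)
w = rng.standard_normal(1000)
lhs = (1 - s) * (u - w)**2 - s * (1 - s) * u**2 + s * w**2
rhs = ((1 - s) * u - w)**2
print(np.allclose(lhs, rhs))
```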
If $s_r=0$, then $a_{rj}=0$ for all $j=1,\ldots,n$. Then $b_{rj}=\delta_{rj}$ for $j=1,\ldots,n$, hence $u_r=w_r$ and $z_r=0$. Since the left-hand side of (2) also vanishes for this $r$, (2) is trivially true in the exceptional case.