An inequality using column sums of inverse matrices


I want to prove a matrix analogue of the inequality $\left(\frac{1-x(1-\alpha)}{\alpha}\right)^{\alpha} x^{1-\alpha} \leq 1$ for $\alpha \in [0,1)$ and $x \in [0,1]$, which has a nice proof using GM-AM, as shown in https://math.stackexchange.com/a/4176762/165163.
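For reference, the scalar inequality follows from a single weighted GM-AM step with weights $\alpha$ and $1-\alpha$, which sum to one (taking $\alpha \in (0,1)$ so both terms are well defined and nonnegative):

```latex
\left(\frac{1-x(1-\alpha)}{\alpha}\right)^{\alpha} x^{1-\alpha}
  \;\le\; \alpha \cdot \frac{1-x(1-\alpha)}{\alpha} + (1-\alpha)\,x
  \;=\; 1 - x(1-\alpha) + (1-\alpha)\,x \;=\; 1.
```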

Let $A=(a_{ij})$ be an $N \times N$ strictly substochastic matrix (nonnegative, with every row sum strictly less than one), let $\alpha_i \equiv 1 - \sum_j a_{ij} > 0$, let $x$ be a scalar, and let $f:\mathbb{R} \rightarrow \mathbb{R}$ be the function defined by $$f(x) = \prod_{i}\left(\frac{\sum_{l}w_{li}(1)}{\sum_{l}w_{li}(x)}\right)^{\frac{\sum_{j}w_{ji}\left(1\right)\alpha_{i}}{\bar{w}}}x^{\frac{\bar{w}-N}{\bar{w}}},$$ where $w_{ij}(x)$ are the elements of the matrix $W(x) \equiv \left(I-xA\right)^{-1}$ and $\bar{w}\equiv\sum_{i,j}w_{ij}(1)$.
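To make the definition concrete, here is a minimal numpy sketch of $f$; the matrix `A` is a hypothetical example, not part of the question:

```python
import numpy as np

def f(A, x):
    """f(x) from the definition above, for scalar x and strictly substochastic A."""
    N = A.shape[0]
    alpha = 1.0 - A.sum(axis=1)            # alpha_i = 1 - sum_j a_ij
    W1 = np.linalg.inv(np.eye(N) - A)      # W(1) = (I - A)^{-1}
    Wx = np.linalg.inv(np.eye(N) - x * A)  # W(x) = (I - xA)^{-1}
    wbar = W1.sum()                        # sum of all entries of W(1)
    col1 = W1.sum(axis=0)                  # column sums of W(1): sum_l w_li(1)
    colx = Wx.sum(axis=0)                  # column sums of W(x): sum_l w_li(x)
    exps = col1 * alpha / wbar             # exponent of the i-th factor
    return np.prod((col1 / colx) ** exps) * x ** ((wbar - N) / wbar)

A = np.array([[0.2, 0.3],
              [0.1, 0.4]])                 # hypothetical strictly substochastic matrix
```

With this, `f(A, 1.0)` returns 1 and `f(A, 0.0)` returns 0, matching the boundary values noted below.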

I want to show that $f(x) \leq 1$ for all $x\in[0,1]$.

It is clear that $f(1) = 1$ and that $f(0) = 0$ (the latter because $\bar{w} > N$ whenever $A \neq 0$). Also, note that $\sum_i w_{ji}(1) \alpha_i = 1$ for all $j$: writing $\iota$ for the vector of ones and $\mathcal{D}\{\cdot\}$ for the diagonal matrix with the given vector on its diagonal, $$ \left\{ \sum_i w_{ji}(1) \alpha_i \right\}_j = W(1)\left(I-\mathcal{D}\left\{ A\iota\right\} \right)\iota=W(1)\left(I-A\right)\iota=\left(I-A\right)^{-1}\left(I-A\right)\iota=\iota.$$

Summing over $j$ gives $\sum_{i,j} w_{ji}(1)\alpha_i = N$, so the exponents in $f(x)$ sum to one: $$\sum_i \frac{\sum_j w_{ji}(1)\alpha_i}{\bar{w}} + \frac{\bar{w}-N}{\bar{w}} = \frac{N}{\bar{w}} + \frac{\bar{w}-N}{\bar{w}} = 1.$$ Thus, we can think of $f(x)$ as a geometric mean. But the corresponding arithmetic mean, $$ \sum_{i} \left(\frac{\sum_{j}w_{ji}\left(1\right)\alpha_{i}}{\bar{w}}\right) \left(\frac{\sum_{l}w_{li}(1)}{\sum_{l}w_{li}(x)}\right) + \left(\frac{\bar{w}-N}{\bar{w}} \right) x, $$ does not equal one in general, so the GM-AM approach does not work immediately -- except when $A$ is diagonal, in which case the expression above collapses to $$ \sum_{i} \left(\frac{1}{\bar{w}}\right) \left(\frac{1-x (1-\alpha_i)}{\alpha_i}\right) + \left(\frac{\bar{w}-N}{\bar{w}} \right) x =1, $$ and the desired inequality follows from GM-AM.
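Both identities above (the weighted row sums of $W(1)$ and the exponents summing to one) are easy to sanity-check numerically; a minimal numpy sketch, where the random `A` is a hypothetical example:

```python
import numpy as np

# Hypothetical example: a random strictly substochastic matrix.
rng = np.random.default_rng(0)
N = 5
A = rng.uniform(size=(N, N))
A /= A.sum(axis=1, keepdims=True) + 1.0   # each row now sums to s/(s+1) < 1

alpha = 1.0 - A.sum(axis=1)               # alpha_i = 1 - sum_j a_ij > 0
W1 = np.linalg.inv(np.eye(N) - A)         # W(1)
wbar = W1.sum()

# Identity: sum_i w_ji(1) alpha_i = 1 for every j, i.e. W(1) @ alpha = iota.
row_identity = W1 @ alpha

# Exponents of f sum to one: sum_i (sum_j w_ji(1)) alpha_i / wbar + (wbar - N)/wbar.
exponent_sum = (W1.sum(axis=0) * alpha).sum() / wbar + (wbar - N) / wbar
```

Here `row_identity` is a vector of ones and `exponent_sum` equals 1, up to floating-point error.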

Simulations suggest that the inequality holds -- in fact, they suggest that $f(x)$ is always increasing on $[0,1]$. I could try to show $f'(x) \geq 0$, but this would limit how much I can then generalize, for example to letting $x$ be a vector, in which case $f:\mathbb{R}^N \rightarrow \mathbb{R}$ is given by $$f(x)= \prod_{i}\left(\frac{\sum_{l}w_{li}(1)}{\sum_{l}w_{li}(x)}\right)^{\frac{\sum_{j}w_{ji}\left(1\right)\alpha_{i}}{\bar{w}}} \prod_i x_i^{\frac{\sum_j w_{ij}(1)-1}{\bar{w}}},$$ where $w_{ij}(x)$ are now the elements of the matrix $\left(I-A \,\mathrm{diag}(x)\right)^{-1}$. Note that setting every $x_i = x$ recovers the scalar exponent $\frac{\bar{w}-N}{\bar{w}}$, since $\sum_i \left(\sum_j w_{ij}(1) - 1\right) = \bar{w}-N$.
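The vector case can be simulated the same way. This is a minimal numpy sketch under my reading of the definition, with the exponent on each $x_i$ taken to be $(\sum_j w_{ij}(1)-1)/\bar{w}$ so that a constant vector $x_i = x$ reproduces the scalar $f$; the matrix `A` is again a hypothetical example:

```python
import numpy as np

def f_vec(A, x):
    """Vector-x version of f, with W(x) = (I - A diag(x))^{-1}."""
    N = A.shape[0]
    alpha = 1.0 - A.sum(axis=1)                  # alpha_i = 1 - sum_j a_ij
    W1 = np.linalg.inv(np.eye(N) - A)            # W(1)
    Wx = np.linalg.inv(np.eye(N) - A @ np.diag(x))
    wbar = W1.sum()
    col1, colx = W1.sum(axis=0), Wx.sum(axis=0)  # column sums sum_l w_li
    exps = col1 * alpha / wbar                   # exponent of the i-th ratio
    row1 = W1.sum(axis=1)                        # row sums sum_j w_ij(1)
    return np.prod((col1 / colx) ** exps) * np.prod(x ** ((row1 - 1.0) / wbar))

A = np.array([[0.2, 0.3],
              [0.1, 0.4]])                       # hypothetical strictly substochastic matrix

# Consistency check against the scalar definition at constant x = 0.5:
alpha = 1.0 - A.sum(axis=1)
W1 = np.linalg.inv(np.eye(2) - A)
Wh = np.linalg.inv(np.eye(2) - 0.5 * A)
wbar = W1.sum()
val_scalar = (np.prod((W1.sum(axis=0) / Wh.sum(axis=0))
                      ** (W1.sum(axis=0) * alpha / wbar))
              * 0.5 ** ((wbar - 2) / wbar))
val_vec = f_vec(A, np.full(2, 0.5))
```

By construction `f_vec(A, np.ones(2))` equals 1, and `val_vec` agrees with `val_scalar`.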

Any suggestions would be much appreciated.