Prove $(I +A)^{n-1}$ is entrywise positive if $A$ is entrywise non-negative.


Let $A$ be an $n \times n$ real matrix that is entrywise non-negative. I know that for each $i$ and $j$, $a_{ij}^{(k)} \neq 0$ for some positive integer $k$, where $a_{ij}^{(k)}$ denotes the $ij$ entry of $A^k$. I want to show that $(I + A)^{n-1}$ is entrywise strictly positive.

My thoughts: Since $A$ is entrywise non-negative, $A^k$ is also entrywise non-negative for all $k$, i.e. $a^{(k)}_{ij} \ge 0$ for all $i, j$ and all $k$. Hence, for each $i$ and $j$, the hypothesis that $a^{(k)}_{ij} \neq 0$ for some $k$ actually gives $a^{(k)}_{ij} > 0$ for that $k$. Let $B = (I_n + A)^{n-1}$. By the binomial theorem (which applies since $I_n$ and $A$ commute), $$ B = (I_n + A)^{n-1} = I_n + (n-1)A + \frac{(n-1)(n-2)}{2}A^2 + \dots + A^{n-1}. $$ Let $b_{ij}$ be the $ij$ entry of $B$. For each $i$ and $j$, $b_{ij}$ is a finite sum of non-negative numbers. The problem I am having is that I don't know the relationship between $k$ and $n$. If $k < n$, then $b_{ij}$ is the sum of non-negative numbers and at least one positive number, which implies $b_{ij} > 0$. But if $k \ge n$, I cannot say that. Can someone point out how I should use the condition that for each $i$ and $j$, $a^{(k)}_{ij} > 0$ for some $k$? Thanks in advance!
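As a quick numerical illustration of the claim (not a proof), here is a sketch in Python assuming numpy is available. The directed $4$-cycle is a natural test case: it satisfies the hypothesis, yet no single power $A^k$ is entrywise positive, so the binomial mixing of powers really is needed.

```python
import numpy as np

# Illustration only: A is the adjacency matrix of a directed 4-cycle.
# It is non-negative, and for every (i, j) some power A^k has a positive
# (i, j) entry, yet no single A^k is entrywise positive (every power of
# a cycle is a permutation matrix).
n = 4
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = 1.0

B = np.linalg.matrix_power(np.eye(n) + A, n - 1)
print((B > 0).all())  # True: (I + A)^{n-1} is entrywise positive
```

Here $(I+A)^3 = I + 3A + 3A^2 + A^3$, and the four powers of the cycle together cover every position of the matrix.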



The following statement can be proved by induction on the order of the matrix $A$. If all entries of a matrix $A$ of order $n>1$ are non-negative, all entries of the main diagonal are positive, and all entries of $A^k$ are positive for some $k>0$, then there exists $0<l<n$ such that all entries of $A^l$ are positive.

Let $$ A= \begin{pmatrix} A' & X\\ Y & s \end{pmatrix},\qquad B= \begin{pmatrix} B' & U\\ V & t \end{pmatrix}, $$ where $A'$ and $B'$ are matrices of order $n-1$. We have $$ AB= \begin{pmatrix} A'B'+XV & A'U+tX\\ YB'+sV & YU+st \end{pmatrix}. $$
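The block-multiplication formula above is standard; as a minimal sanity check, the sketch below verifies it numerically with random matrices (assuming numpy).

```python
import numpy as np

# Verify the block formula for AB, partitioning off the last row/column:
# A = [[A', X], [Y, s]],  B = [[B', U], [V, t]].
rng = np.random.default_rng(1)
n = 4
A = rng.random((n, n))
B = rng.random((n, n))
Ap, X, Y, s = A[:-1, :-1], A[:-1, -1:], A[-1:, :-1], A[-1, -1]
Bp, U, V, t = B[:-1, :-1], B[:-1, -1:], B[-1:, :-1], B[-1, -1]

AB = A @ B
assert np.allclose(AB[:-1, :-1], Ap @ Bp + X @ V)   # A'B' + XV
assert np.allclose(AB[:-1, -1:], Ap @ U + t * X)    # A'U + tX
assert np.allclose(AB[-1:, :-1], Y @ Bp + s * V)    # YB' + sV
assert np.isclose(AB[-1, -1], (Y @ U)[0, 0] + s * t)  # YU + st
print("block formula verified")
```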

From this formula we see that if all entries of matrix $A^k$ are positive for some $k>0$, then the same is true for entries of matrix $A'^k$. Hence, by induction, there exists $0<l<n-1$ for which all entries of $A'^l$ are positive.

The same formula shows that if any entry of the last column of the matrix $A^{l+1}=A^l\cdot A$ is zero, then all entries of the last column of $A$ are zero, except maybe $a_{nn}$: the upper-left block of $A^l$ is entrywise positive, so a zero entry in the last column of the product forces the whole last column of $A$ (above the diagonal entry) to vanish. But then it is not true that all entries of $A^k$ are positive.

The same reasoning, applied to $A^{l+1}=A\cdot A^l$, handles the last row.

Let us note that in the general case (without the condition on the diagonal entries) this statement is not true. For example, $$ A= \begin{pmatrix} 1 & 1\\ 1 & 0 \end{pmatrix},\qquad A^2= \begin{pmatrix} 2 & 1\\ 1 & 1 \end{pmatrix}. $$ Here $A$ itself has a zero entry, so the smallest $k$ with $A^k$ entrywise positive is $k=2=n$.
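This counterexample is easy to check numerically; a minimal sketch assuming numpy:

```python
import numpy as np

# Without positive diagonal entries, the smallest k with A^k entrywise
# positive can be as large as n (here n = 2).
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
A2 = np.linalg.matrix_power(A, 2)
print((A > 0).all())    # False: A itself (l = 1) has a zero entry
print((A2 > 0).all())   # True: A^2 = [[2, 1], [1, 1]] is positive
```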


I am providing an answer based on graph theory. To the non-negative matrix $A$ we can associate a directed graph $(V,E)$ with $V=\{1,\ldots,n\}$ and $(i,j)\in E$ whenever $a_{ji}>0$.

This means that $(i,j)\in E$ if and only if there is a path of length $1$ from node $i$ to node $j$.

Similarly, let $a_{ij}^{(k)}$ denote the $(i,j)$-th entry of $A^k$. Then $a_{ij}^{(k)}>0$ if and only if there is a path of length $k$ from node $j$ to node $i$.

Note that if one node is reachable from another at all, then it is reachable by a path of length at most $n-1$: a shortest path visits no node twice, and hence uses at most $n-1$ edges. Therefore, if a node is not reached within $n-1$ steps, it is not reachable at all.

Consider, for instance, the matrix

$$A=\begin{bmatrix}0 & 1& 0\\1 & 0 & 0\\0 & 0 & 0\end{bmatrix}.$$

Node $3$ is isolated here: it is not reached within $n-1=2$ steps from either of the other nodes, and indeed no power of $A$ ever connects it to them.
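The reachability claim can be sketched numerically (assuming numpy): accumulating the positivity patterns of $A^1,\ldots,A^{n-1}$ recovers exactly which pairs of nodes are connected, and node $3$ is connected to nothing.

```python
import numpy as np

# Reachability for the 3x3 example via matrix powers.
# Convention from above: a_{ij}^{(k)} > 0 iff there is a path of
# length k from node j to node i.
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
n = A.shape[0]
reach = np.zeros_like(A, dtype=bool)
for k in range(1, n):                  # path lengths 1 .. n-1 suffice
    reach |= np.linalg.matrix_power(A, k) > 0
print(reach)   # third row and third column are all False
```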

This means that the condition that for each pair $(i,j)$ with $i\ne j$ there exists a $k>0$ such that $a_{ij}^{(k)}>0$ is equivalent to the statement that there exists a $k\in\{1,\ldots,n-1\}$ such that $a_{ij}^{(k)}>0$.

In this regard, the sum $A+A^2+\cdots+A^{n-1}$ has all off-diagonal entries positive and, as a result, $I+A+A^2+\cdots+A^{n-1}$ is an entrywise positive matrix.

This then implies that $(I+A)^{n-1}$ is a positive matrix: by the binomial theorem, $(I+A)^{n-1}=\sum_{k=0}^{n-1}\binom{n-1}{k}A^k$, and every power $A^k$ with $0\le k\le n-1$ appears with a positive coefficient.
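The whole argument can be sketched numerically (assuming numpy). The example matrix below is an assumption of this sketch: a random non-negative matrix forced to contain a directed cycle through all nodes, so that its graph is strongly connected and the hypothesis holds.

```python
import numpy as np

# For a non-negative A whose graph is strongly connected, both the
# geometric sum I + A + ... + A^{n-1} and (I + A)^{n-1} come out
# entrywise positive.
rng = np.random.default_rng(0)
n = 5
A = rng.random((n, n)) * (rng.random((n, n)) < 0.3)  # sparse non-negative
for i in range(n):
    A[i, (i + 1) % n] = 1.0   # force a cycle 1 -> 2 -> ... -> n -> 1

S = sum(np.linalg.matrix_power(A, k) for k in range(n))  # I + A + ... + A^{n-1}
B = np.linalg.matrix_power(np.eye(n) + A, n - 1)
print((S > 0).all(), (B > 0).all())  # True True
```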

Finally, it is interesting to note that the condition is equivalent to saying that the graph is strongly connected, which is, in turn, equivalent to the fact that the matrix $A$ is irreducible.