Does there exist a stochastic matrix $P$ such that $\ker(P-I)$ contains no nonnegative vector?


In the proof of the existence and uniqueness of a stationary distribution of a finite-state discrete Markov chain, I came across the construction of a distribution vector $v \in \mathbb{R}^n$ with $v = vP$ for a stochastic matrix $P$ (equivalently, $v \geq 0$ with $v \in \ker((P-I)^T)$). It is stated that one needs a bit of "luck" to obtain such a vector, which leads me to believe that such a vector does not exist for every transition matrix $P$.

I already tried setting $P$ to correspond to the cyclic chain $1 \rightarrow 2 \rightarrow \ldots \rightarrow n \rightarrow 1$ with transition probability $1$; that at least restricts $\ker((P-I)^T)$ to vectors whose entries are all equal, but, as I quickly noticed, this does not exclude vectors with entries $\geq 0$.
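This cyclic example can be checked numerically. A small sketch (the setup and names are mine, using numpy), confirming that the uniform vector lies in the kernel and is nonnegative:

```python
import numpy as np

# The cyclic chain 1 -> 2 -> ... -> n -> 1 with transition
# probability 1 is a permutation matrix: P[i, (i+1) % n] = 1.
n = 4
P = np.roll(np.eye(n), 1, axis=1)

# ker((P - I)^T) is spanned by the all-ones vector, so the uniform
# distribution is a stationary (nonnegative) vector: v = vP.
v = np.ones(n) / n
print(np.allclose(v @ P, v))  # True: the uniform vector is stationary
```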

As I am a bit pressed for time, I would like to ask:

For a given $n \in \mathbb{N}$, does there exist a stochastic matrix $P \in [0,1]^{n \times n}$ such that $\ker((P-I)^T)$ contains no nonzero nonnegative vector?



On BEST ANSWER

Let $P$ be any $n\times n$ stochastic matrix. Then $\mathbf x\mapsto\mathbf xP$ induces a bounded linear map from the non-empty, compact, convex set of probability vectors $$\Sigma:=\left\{\mathbf x\in\mathbb R^n:x_i\ge0\text{ and }\sum_{i=1}^nx_i=1\right\}$$ to itself. Take any $\mathbf x\in\Sigma$ (for instance, $\mathbf x:=(\frac1n,\ldots,\frac1n)$). It is easy to see* that the sequence of Cesàro averages $$\mathbf x_k:=\frac1k\sum_{j=0}^{k-1}\mathbf xP^j\in\Sigma,\quad k\ge1,$$ converges as $k\to\infty$ to some element $\mathbf x_\infty\in\Sigma$ such that $\mathbf x_\infty P=\mathbf x_\infty$. Thus $\mathbf x_\infty$ is a nonnegative left-eigenvector of $P$ for the eigenvalue $1$.

* Try to prove this; if you don't succeed, have a look at the Markov–Kakutani fixed-point theorem.
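The averaging argument is also easy to run numerically. A minimal sketch (function name and the example matrix are mine): average $\mathbf xP^j$ over $j = 0,\ldots,k-1$ and check that the result is approximately stationary.

```python
import numpy as np

def cesaro_stationary(P, k=2000):
    """Average x P^j over j = 0..k-1, starting from the uniform vector."""
    n = P.shape[0]
    x = np.ones(n) / n      # any element of the simplex works
    avg = np.zeros(n)
    for _ in range(k):
        avg += x
        x = x @ P           # advance one step of the chain
    return avg / k

# Example: a 2-state chain; the Cesaro average is approximately stationary.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
v = cesaro_stationary(P)
print(v, np.allclose(v @ P, v, atol=1e-3))
```

Each partial average stays in the simplex, so the limit is automatically a probability vector.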


First of all, for any matrix $P\in\mathbb{R}^{n\times n}$ we have that $x\in\mathrm{ker}(P-\lambda I)$, for some $\lambda\in\mathbb{C}$, if and only if $\alpha x\in\mathrm{ker}(P-\lambda I)$ for all $\alpha\in\mathbb{R}$.

Then, we use the properties of nonnegative matrices. If $P$ is a nonnegative matrix, then there exists $v\ge0$ such that $Pv=\lambda_{PF}v$ where $\lambda_{PF}$ is the so-called dominant (or Perron-Frobenius) eigenvalue of $P$.

In the context of a stochastic matrix, the dominant eigenvalue is $1$, and if $P$ is irreducible this eigenvalue is simple. Applying the result above to $P^T$ (which is also nonnegative and has the same eigenvalues as $P$) yields $v\ge0$ with $v\in\mathrm{ker}((P-I)^T)$. To be a stationary distribution, however, $v$ must also sum to one.

Therefore, the unique stationary distribution will be $v/||v||_1$.

In fact, any stochastic matrix with a simple eigenvalue at $1$ has a unique nonnegative left-eigenvector associated with the eigenvalue $1$ that sums to one. This vector is the stationary distribution.
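As a sketch of this recipe (the example matrix is mine): compute an eigenvector of $P^T$ for the eigenvalue $1$ with numpy and normalize it by its $1$-norm to obtain the stationary distribution.

```python
import numpy as np

# An irreducible stochastic matrix (rows sum to one).
P = np.array([[0.2, 0.8, 0.0],
              [0.3, 0.3, 0.4],
              [0.5, 0.0, 0.5]])

w, V = np.linalg.eig(P.T)        # right eigenvectors of P^T = left of P
i = np.argmin(np.abs(w - 1.0))   # locate the (simple) eigenvalue 1
v = np.real(V[:, i])
v = v / np.abs(v).sum()          # v / ||v||_1, as in the answer
if v.sum() < 0:                  # eig may return the eigenvector negated
    v = -v

print(np.allclose(v @ P, v))     # True: v is the stationary distribution
```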