Why does it follow from Perron-Frobenius that constant functions are the only harmonic functions here?


In our reading course we had the following example of a Markov chain.

State space $E=\left\{1,2,3,4\right\}$ and transition matrix $$ P=\frac{1}{3}\begin{pmatrix}0 & 1 & 1 & 1\\1 & 0 & 1 & 1\\1 & 1 & 0 & 1\\1 & 1 & 1 & 0\end{pmatrix}. $$

We now want to find all harmonic functions $f\colon E\to\mathbb{R}$, that is, all functions satisfying $$ Pf(x)=\sum_{y\in E}p_{x,y}f(y)=f(x)\quad\text{for all }x\in E. $$

It is clear that all constant functions are harmonic.
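As a quick numerical sanity check (not part of the original argument), one can verify with NumPy that a constant vector is fixed by $P$:

```python
import numpy as np

# Transition matrix from the example: 1/3 off the diagonal, 0 on it
P = (np.ones((4, 4)) - np.eye(4)) / 3

# A constant function f = 5 on E = {1,2,3,4}, written as a vector
f = np.full(4, 5.0)

# Harmonicity condition: P f = f
print(np.allclose(P @ f, f))  # True
```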

But then it is said: By Perron-Frobenius and the fact that $P$ is irreducible and aperiodic, it follows that there are no other harmonic functions.

I do not understand why it follows that there are no harmonic functions other than the constant ones.

Here is how we wrote down the Perron-Frobenius Theorem:

Let $A=(a_{i,j})_{i,j\in E}$ be a finite, irreducible, non-negative matrix. Then there exists a positive eigenvalue $\lambda >0$ of $A$ with the following properties: (1) $\lambda$ is a simple root of the characteristic polynomial. (2) If $\kappa$ is any other eigenvalue of $A$, then $\lvert\kappa\rvert\leq\lambda$. (3) $\lambda$ has a positive right eigenvector $h$; moreover, if a function $f\colon E\to [0,\infty)$ satisfies $Af\leq\lambda f$, then $f=c\cdot h$ for some constant $c$. (4) There exists a strictly positive measure $\nu$ on $E$ that spans the left eigenspace of $A$ with respect to $\lambda$. Furthermore, if a non-negative measure $\mu$ on $E$ satisfies $\mu A\leq\lambda\mu$, then $\mu=c\cdot\nu$ for some constant $c$. (5) If $A$ is irreducible and aperiodic and $h$ and $\nu$ as above are normalized such that $\sum_{i\in E}h_i\nu_i=1$, then $\lvert\kappa\rvert <\lambda$ for every $\kappa\in\operatorname{spec}(A)\setminus\left\{\lambda\right\}$ and $\lim_{n\to\infty}\frac{a_{i,j}^{(n)}}{\lambda^n}=h_i\nu_j$ for all $i,j\in E$.
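For the example matrix above, the conclusion of part (5) can be checked numerically: here $\lambda = 1$, $h = (1,1,1,1)$, and $\nu$ is the uniform distribution, so every entry of $P^n$ should converge to $1/4$. A small NumPy sketch, assuming only the matrix from the question:

```python
import numpy as np

P = (np.ones((4, 4)) - np.eye(4)) / 3  # the example matrix, lambda = 1

# Part (5): a_{ij}^(n) / lambda^n converges as n grows.  With
# h = (1,1,1,1) and nu uniform, every entry of P^n should be near 1/4.
Pn = np.linalg.matrix_power(P, 50)
print(np.allclose(Pn, 0.25))  # True
```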

Best answer:

First, we regard the function $f$ as a column vector. Then $f$ is harmonic if and only if it satisfies $P\cdot f = f$, i.e. $$P \cdot \begin{bmatrix} f(1)\\ f(2) \\ f(3) \\ f(4)\end{bmatrix} = \begin{bmatrix} f(1) \\ f(2) \\ f(3) \\ f(4)\end{bmatrix}.$$

First case: $f\neq 0$

We have that the vector $ \vec f = \begin{bmatrix} f(1) \\ f(2) \\ f(3) \\ f(4) \end{bmatrix} $ is a right eigenvector that corresponds to the eigenvalue $\lambda= 1$.

Since $P$ is stochastic, its row sums equal $1$, so the Perron eigenvalue is $\lambda = 1$; and since $P$ is irreducible, the Perron–Frobenius theorem gives that the eigenvalue $\lambda = 1$ is simple.

Thus, the respective eigenspace is of the form:

$V(\lambda = 1)= \left\{k\cdot \vec f \mid k \in \mathbb R\right\}=\langle \vec f\rangle,$ which means $\dim V(\lambda = 1) = 1$.

Also, note that $\vec e = \begin{bmatrix} 1\\1\\1\\1 \end{bmatrix} $ is always a right eigenvector of a stochastic matrix corresponding to the eigenvalue $\lambda = 1$. So $V(\lambda = 1) = \langle \vec e\rangle$.

Thus, there must be a scalar $c \in \mathbb R^*$ s.t. $$\begin{bmatrix} f(1) \\ f(2) \\ f(3) \\ f(4) \end{bmatrix} = c \cdot \begin{bmatrix} 1\\1\\1\\1 \end{bmatrix},$$ (since $\vec e,\, \vec f$ are linearly dependent) which implies that $f(1) = \cdots = f(4) = c.$ Thus, $f$ is a constant function.

Second case: $f = 0$

Trivially $P\cdot f = f$, and $f = 0$ is a constant function as well. Hence every harmonic function is constant.
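The case analysis above can also be confirmed numerically: the spectrum of $P$ is $\{1, -1/3, -1/3, -1/3\}$, so $\lambda = 1$ is simple and its eigenspace is spanned by a constant vector. A short NumPy check (a sketch, not part of the proof):

```python
import numpy as np

P = (np.ones((4, 4)) - np.eye(4)) / 3

# Eigenvalue 1 is simple; the other three eigenvalues are -1/3
eigvals, eigvecs = np.linalg.eig(P)
print(np.allclose(np.sort(eigvals.real), [-1/3, -1/3, -1/3, 1]))  # True

# The eigenvector for lambda = 1 has all components equal
v = eigvecs[:, np.argmax(eigvals.real)]
print(np.allclose(v, v[0]))  # True
```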