Markov chain on a graph


Suppose $(X_0, X_1, X_2,\dots)$ is a Markov chain on a graph $G=(V,E)$ with a distinguished node $v^* \in V$. The transition matrix $P$ satisfies $P(x,y)>0$ iff $x$ and $y$ are neighbours, written $x \sim y$; we also assume that $x \sim x$ for every $x\in V$, and consequently $P(x,x)>0$. Moreover, for each node $x\in V\setminus\{v^*\}$, we have $P(x,y) = p > 1/2$ for a node $y$ with $d(y,v^*)<d(x,v^*)$ (one such node, chosen arbitrarily if there are several), and the remaining probability $1-p$ is spread over the other neighbours of $x$. How can we prove that the invariant measure $\pi$ puts maximum mass on $v^*$, i.e. that $\pi(v^*)>\pi(v)$ for all $v\in V\setminus\{v^*\}$?
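As a numerical sanity check (not a proof), here is a small sketch of one such chain on a path graph. The specific graph, the value $p=0.7$, and the way the remaining mass $1-p$ is split between the self-loop and the outward neighbour are all my own arbitrary choices, not part of the question:

```python
import numpy as np

# Hypothetical instance: a path graph 0-1-2-3-4 with v* = node 0 and p = 0.7.
# From each x > 0, step toward v* with probability p; the split of the
# remaining 1-p between self-loop and outward neighbour is our choice.
p, n = 0.7, 5
P = np.zeros((n, n))
P[0, 0], P[0, 1] = p, 1 - p            # v* keeps a self-loop (P(x,x) > 0)
for x in range(1, n):
    P[x, x - 1] = p                    # step toward v*
    if x < n - 1:
        P[x, x] = P[x, x + 1] = (1 - p) / 2
    else:
        P[x, x] = 1 - p                # endpoint has no outward neighbour

# Invariant measure: left eigenvector of P for eigenvalue 1, normalised.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

print(pi)  # mass is largest at v* and decays with distance
```

On this instance the chain is a birth-death chain, so the decay of $\pi$ away from $v^*$ can also be read off directly from detailed balance.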

Accepted answer:

The right way to approach this is to prove the apparently stronger statement that $v^*$ is given more mass than all vertices at distance $d$ from $v^*$ combined, for every $d$. (In fact it is equivalent, since if there is a counterexample to this statement we can contract all vertices at distance $d$ to a single vertex and get a counterexample to the original statement.)

Maybe this is enough of a hint; if not there is a spoilered explanation below.

Let $q_d$ be the total mass, under the invariant measure, of all vertices at distance $d$ from $v^*$. Clearly $q_0\geq pq_0+pq_1$, since from any vertex at distance $0$ or $1$ you move to distance $0$ with probability at least $p$. Rearranging gives $(1-p)q_0\geq pq_1$, and since $p>1-p$ this implies $q_0>q_1$.
Similarly $q_0+q_1\geq q_0+pq_1+pq_2$, since from distance $0$ you can only move to distance $0$ or $1$, and from distance $1$ or $2$ you move to distance $0$ or $1$ with probability at least $p$. Subtracting $q_0$ gives $(1-p)q_1\geq pq_2$, so $q_1>q_2$. Continuing in this manner, $q_d$ is decreasing in $d$, so it is maximised at $q_0=\pi(v^*)$.
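To illustrate the distance-shell quantities $q_d$ on a graph where several vertices share a distance, here is a toy numerical check on a depth-2 binary tree. The tree, the value $p=0.7$, and the split of $1-p$ among self-loop and children are my own choices; note the root's self-loop is set to $p$, matching the claim that from distance $0$ you stay at distance $0$ with probability at least $p$:

```python
import numpy as np

# Toy example: depth-2 binary tree, root = v* (node 0), children 1,2,
# leaves 3,4,5,6.  p = 0.7; the root's self-loop is also p.
p = 0.7
P = np.zeros((7, 7))
P[0, 0] = p
P[0, 1] = P[0, 2] = (1 - p) / 2
for child, leaves in [(1, (3, 4)), (2, (5, 6))]:
    P[child, 0] = p                       # child steps toward v*
    P[child, child] = 0.1                 # self-loop
    for leaf in leaves:
        P[child, leaf] = 0.1              # step away from v*
        P[leaf, child] = p                # leaf steps toward v*
        P[leaf, leaf] = 1 - p             # leaf self-loop

# Invariant measure: left eigenvector of P for eigenvalue 1, normalised.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()

# q_d: total invariant mass of the distance-d shell
q = np.array([pi[0], pi[1:3].sum(), pi[3:].sum()])
print(q)  # q_0 > q_1 > q_2, as the argument predicts
```

The printed shell masses are strictly decreasing, consistent with the chain of inequalities above.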