Show that a random walk on graph $G$ is an irreducible Markov chain and determine its invariant distribution.


Consider an undirected connected graph $G=(V,E)$ with $|V|<\infty$. Let $(X_n)_{n\geq 0}$ be a random walk on $G$, i.e. a Markov chain that at each time step moves to a neighbour of the current vertex chosen uniformly at random. That is, $(X_n)_{n\geq 0}$ has state space $V$ and transition probabilities

\begin{equation*} p_{uv} = \begin{cases} \frac{1}{d(u)} & \text{if } (u,v)\in E\\ 0 & \text{else} \end{cases} \end{equation*} where $$d(u)= |\{v\in V : (u,v)\in E\}|$$ is the degree of $u$. Show that a random walk on $G$ is an irreducible Markov chain and determine its invariant distribution.
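As a concrete illustration of these transition probabilities, here is a sketch (on a small made-up graph, given as an adjacency list) of how the transition matrix $P$ is built; the graph itself is an assumption for the example, not part of the question.

```python
# Hypothetical example graph on 4 vertices, as an adjacency list.
adj = {
    0: [1, 2],
    1: [0, 2, 3],
    2: [0, 1],
    3: [1],
}

n = len(adj)
P = [[0.0] * n for _ in range(n)]
for u, neighbours in adj.items():
    d = len(neighbours)        # degree d(u)
    for v in neighbours:
        P[u][v] = 1.0 / d      # p_uv = 1/d(u) if (u, v) in E, else 0

# Each row of P sums to 1, so P is a valid transition matrix.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```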

Because the graph is connected, any two vertices $u,v\in V$ are joined by a path, so $p^{(n)}_{uv}>0$ for some $n\geq 0$. Hence all states communicate and the Markov chain is irreducible.
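The reachability argument can be checked mechanically: irreducibility means every vertex is reachable from every other one along edges with positive transition probability. A minimal sketch (using the same made-up example graph as an assumption):

```python
from collections import deque

# Hypothetical example graph on 4 vertices.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}

def reachable(start):
    """Breadth-first search: all vertices reachable from `start`."""
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

# Irreducible iff every state communicates with every other state.
assert all(reachable(u) == set(adj) for u in adj)
```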

I'm stuck on how to find the invariant distribution. I know that the invariant distribution $\Pi$ is a probability vector such that $\Pi P=\Pi$ where $P$ is the transition matrix of $(X_n)_{n\geq 0}$.

I don't know how to proceed, since I don't know which vertices are neighbours of which.

On BEST ANSWER

If you do not know the graph, then you need to look for something general. One useful piece of information you do have is the degree of each vertex. Intuitively, it sounds plausible that the invariant distribution is related to the vertex degrees. If you draw some simple graphs, you can check that this is indeed the case: the invariant probability of each vertex is its degree divided by the sum of all degrees. I suggest you try this yourself; starting small to build intuition is often a good strategy when tackling a mathematics problem whose answer you do not know.
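The "draw some simple graphs" experiment can also be run numerically: simulate a long walk and compare the empirical visit frequencies with the degree ratios $d(v)/\sum_k d(v_k)$. The graph below is a made-up example (it contains a triangle, so the chain is aperiodic and the frequencies converge):

```python
import random

# Hypothetical example graph; degrees are 2, 3, 2, 1 (sum 8).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}

random.seed(0)
counts = {v: 0 for v in adj}
state = 0
steps = 200_000
for _ in range(steps):
    state = random.choice(adj[state])  # uniform random neighbour
    counts[state] += 1

total_degree = sum(len(nb) for nb in adj.values())
for v in adj:
    empirical = counts[v] / steps
    predicted = len(adj[v]) / total_degree  # d(v) / sum_k d(v_k)
    print(v, round(empirical, 3), round(predicted, 3))
```

For this graph the predicted values are $2/8$, $3/8$, $2/8$, $1/8$, and the empirical frequencies land close to them.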

The next step is to formalize this idea. Number the vertices $1,2,\dots,|V|$ and let $d(v_{i})$ be the degree of vertex $i$. Define $\Pi_{i}$ as the ratio between the degree of vertex $i$ and the sum of all degrees, so $\Pi_{i} = \frac{d(v_{i})}{\sum_{k=1}^{|V|} d(v_{k})}$. The probability of going from vertex $j$ to vertex $i$ is $P_{j, i} = \frac{1}{d(v_{j})}$ if $(v_{j}, v_{i}) \in E$ and $0$ otherwise. We then get \begin{equation} (\Pi P)_{i} = \sum_{j = 1}^{|V|} \Pi_{j} P_{j, i} = \sum_{j : (v_{j}, v_{i}) \in E} \frac{d(v_{j})}{\sum_{k=1}^{|V|} d(v_{k})} \cdot \frac{1}{d(v_{j})} = \sum_{j : (v_{j}, v_{i}) \in E} \frac{1}{\sum_{k=1}^{|V|} d(v_{k})} = \frac{d(v_{i})}{\sum_{k=1}^{|V|} d(v_{k})} = \Pi_{i} \end{equation} for $i = 1,2,\dots,|V|$, where the last equality holds because the sum has exactly $d(v_{i})$ terms, one for each neighbour of $v_{i}$. We can now conclude that $\Pi P = \Pi$.
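The computation $\Pi P = \Pi$ can be verified exactly (in rational arithmetic, so there is no floating-point slack) on a small example; the graph here is again a made-up illustration:

```python
from fractions import Fraction

# Hypothetical example graph; degrees 2, 3, 2, 1, total degree 8.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
n = len(adj)
total = sum(len(nb) for nb in adj.values())

# Pi_i = d(v_i) / sum_k d(v_k)
Pi = [Fraction(len(adj[i]), total) for i in range(n)]

# P_{j,i} = 1/d(v_j) if (v_j, v_i) in E, else 0
P = [[Fraction(0)] * n for _ in range(n)]
for u, nbs in adj.items():
    for v in nbs:
        P[u][v] = Fraction(1, len(nbs))

# (Pi P)_i = sum_j Pi_j P_{j,i}; invariance means Pi P == Pi exactly.
PiP = [sum(Pi[j] * P[j][i] for j in range(n)) for i in range(n)]
assert PiP == Pi
assert sum(Pi) == 1  # Pi is a probability distribution
```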

Depending on your exact definition, you might need to be careful with graphs that give a periodic Markov chain, for instance a graph with only two vertices (the walk is periodic exactly when the graph is bipartite). In that case $\Pi P = \Pi$ still holds, but the distribution of $X_n$ need not converge to $\Pi$.