Typical mixing in random walks on the random graph


I'm trying to understand two things about Theorem 1 in the journal paper *Random Walks on the Random Graph*. Theorem 1 is about the mixing time of a simple random walk on an Erdős–Rényi random graph. Let $\mathcal{C}_1$ be the largest connected component of the graph. The theorem says that if the walk is started from a uniform distribution over $\mathcal{C}_1$, then the mixing time is $O(\log n)$. I have the following questions about this theorem.

  1. I do not understand what the stationary distribution of the Markov chain in Theorem 1 is, or what its support is. Since the random walk is a simple random walk, I would guess the stationary distribution is uniform over the giant component $\mathcal{C}_1$ of the Erdős–Rényi random graph. But this leads to my second question.
  2. Theorem 1 says that $v_1$, the vertex at which the random walk starts, is chosen uniformly at random from $\mathcal{C}_1$. If the stationary distribution really were uniform over $\mathcal{C}_1$, the walk would be starting from stationarity, and then studying its mixing time wouldn't make sense.

Any ideas? Thanks.

Best Answer

In a simple random walk on a connected graph, the unique stationary distribution gives each vertex a probability proportional to its degree. (In other words, $\pi_v = \frac{\deg(v)}{2m}$, where $m$ is the number of edges.)
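As a quick sanity check, here is a minimal sketch (using a small hand-built graph, not anything from the paper) that verifies $\pi_v = \frac{\deg(v)}{2m}$ is stationary under one step of the simple random walk:

```python
from fractions import Fraction

# A small connected graph: a triangle {0, 1, 2} with a pendant vertex 3
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
m = sum(len(nbrs) for nbrs in adj.values()) // 2  # number of edges

# Candidate stationary distribution: pi_v = deg(v) / (2m)
pi = {v: Fraction(len(adj[v]), 2 * m) for v in adj}
# Here: pi = {0: 1/4, 1: 1/4, 2: 3/8, 3: 1/8}

# One step of the walk: (pi P)_v = sum over neighbors u of pi_u / deg(u)
pi_next = {v: sum(pi[u] / len(adj[u]) for u in adj[v]) for v in adj}
assert pi_next == pi  # pi is exactly stationary
```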

When the graph is not connected, there are many stationary distributions: we can weight the different connected components arbitrarily. If we start from the uniform distribution, then the probability that we're in a particular connected component is proportional to the number of vertices it has, and this will not change in a random walk. So the limiting distribution we expect is $\pi_v = \frac{\deg(v)}{2m_c} \cdot \frac{n_c}{n}$ for a vertex $v$ in a component with $m_c$ edges and $n_c$ vertices.
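A small sketch of the disconnected case (again a hand-built graph, for illustration only): the formula $\pi_v = \frac{\deg(v)}{2m_c} \cdot \frac{n_c}{n}$ is stationary, and a step of the walk from the uniform distribution leaves the total mass in each component at $\frac{n_c}{n}$.

```python
from fractions import Fraction

# Disconnected graph: a triangle {0, 1, 2} and a path {3, 4, 5}
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4], 4: [3, 5], 5: [4]}
n = len(adj)

# Find connected components by depth-first search
seen, comps = set(), []
for s in adj:
    if s in seen:
        continue
    comp, stack = [], [s]
    seen.add(s)
    while stack:
        v = stack.pop()
        comp.append(v)
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                stack.append(u)
    comps.append(comp)

# pi_v = (n_c / n) * deg(v) / (2 m_c)
pi = {}
for comp in comps:
    n_c = len(comp)
    m_c = sum(len(adj[v]) for v in comp) // 2
    for v in comp:
        pi[v] = Fraction(n_c, n) * Fraction(len(adj[v]), 2 * m_c)

# This distribution is stationary: pi P = pi
pi_next = {v: sum(pi[u] / len(adj[u]) for u in adj[v]) for v in adj}
assert pi_next == pi

# One step from the uniform distribution preserves each
# component's total mass, which stays at n_c / n
mu = {v: Fraction(1, n) for v in adj}
mu_next = {v: sum(mu[u] / len(adj[u]) for u in adj[v]) for v in adj}
for comp in comps:
    assert sum(mu_next[v] for v in comp) == Fraction(len(comp), n)
```

(The path component here is bipartite, so the walk on it does not actually converge; the assertions only check stationarity and mass preservation, not convergence.)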

Awkwardly, when the graph (or a component of the graph) is bipartite and unbalanced, we don't expect convergence to this limiting distribution. For example, in a $3$-vertex path, we will start in a $(\frac13, \frac13, \frac13)$ distribution, and alternate between this distribution and the $(\frac16, \frac23, \frac16)$ distribution forever.
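The $3$-vertex-path oscillation can be checked directly, sketched as:

```python
from fractions import Fraction

# Simple random walk on the path 0 - 1 - 2, started uniformly
adj = {0: [1], 1: [0, 2], 2: [1]}

def step(mu):
    # One step of the walk: push each vertex's mass to its neighbors
    return {v: sum(mu[u] / len(adj[u]) for u in adj[v]) for v in adj}

seq = [{v: Fraction(1, 3) for v in adj}]
for _ in range(4):
    seq.append(step(seq[-1]))

# The distribution alternates with period 2 and never converges
assert seq[1] == {0: Fraction(1, 6), 1: Fraction(2, 3), 2: Fraction(1, 6)}
assert seq[2] == seq[0] and seq[3] == seq[1]
```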


In the Erdős–Rényi random graph with $p = \frac{\lambda}{n}$, the average degree is constant, and the degree distribution is asymptotically Poisson with mean $\lambda$. In the giant component the degree distribution is slightly different: notably, all the degree-$0$ vertices are left out. But there is still something nontrivial to converge to. (Many of the small components are trees, and trees are bipartite, so on them we don't even expect convergence, much less fast mixing; this explains why we want to focus on the giant component specifically.)
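To see the effect of conditioning on the giant component, one can sample a $G(n, \lambda/n)$ graph and compare average degrees; the values $n = 2000$, $\lambda = 2$, and the fixed seed below are arbitrary choices for illustration.

```python
import random

random.seed(0)
n, lam = 2000, 2.0
p = lam / n

# Sample G(n, p): include each of the C(n, 2) edges independently
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p:
            adj[u].add(v)
            adj[v].add(u)

# Largest connected component via depth-first search
seen, best = set(), []
for s in range(n):
    if s in seen:
        continue
    comp, stack = [], [s]
    seen.add(s)
    while stack:
        v = stack.pop()
        comp.append(v)
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                stack.append(u)
    if len(comp) > len(best):
        best = comp

avg_all = sum(len(adj[v]) for v in range(n)) / n
avg_giant = sum(len(adj[v]) for v in best) / len(best)

# Isolated (degree-0) vertices never appear in the giant component,
# so its degree distribution is shifted upward relative to Poisson(lam)
assert min(len(adj[v]) for v in best) >= 1
assert avg_giant > avg_all
```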