Rate of eigenvalue decay for Erdős–Rényi random graph


How fast do the eigenvalues decay for a large Bernoulli random graph (also known as an Erdős–Rényi random graph) with a fixed large $n$ (number of nodes) and fixed $p$ (probability of including each edge)? Empirically, if we sort the eigenvalues in decreasing order of magnitude, their magnitudes seem to decay exponentially with $i$, the position in the sorted list. Can someone point me to a reference or a way to justify this?


This plot shows the log of the absolute values of the eigenvalues of a large Erdős–Rényi random graph, arranged in decreasing order.

ListLogPlot@Rest@Reverse@Sort@Abs@Eigenvalues@N@AdjacencyMatrix@RandomGraph@BernoulliGraphDistribution[2000, 0.5]

Edit: the sorted eigenvalues follow the shape of the semicircle CDF.
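This shape can be checked numerically. Below is a minimal Python/NumPy sketch (rather than Mathematica; the size $n$, probability $p$, and tolerance are arbitrary choices): it sorts the bulk eigenvalues (all but the largest) and compares them against the quantiles of the semicircle distribution with radius $2\sqrt{np(1-p)}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 600, 0.5

# Adjacency matrix of G(n, p): symmetric 0/1 entries, zero diagonal.
upper = rng.random((n, n)) < p
A = np.triu(upper, 1)
A = (A + A.T).astype(float)

eigs = np.sort(np.linalg.eigvalsh(A))[::-1]  # decreasing order
bulk = np.sort(eigs[1:])                     # drop the leading eigenvalue ~ np

# Semicircle quantiles for radius r = 2*sqrt(n p (1-p)), obtained by
# numerically inverting the CDF on a grid.
r = 2 * np.sqrt(n * p * (1 - p))
t = np.linspace(-r, r, 20001)
pdf = 2 / (np.pi * r**2) * np.sqrt(r**2 - t**2)
cdf = np.cumsum(pdf) * (t[1] - t[0])
cdf /= cdf[-1]
probs = (np.arange(1, n) - 0.5) / (n - 1)
quantiles = np.interp(probs, cdf, t)

# The sorted bulk eigenvalues should track the semicircle quantiles closely.
err = np.max(np.abs(bulk - quantiles)) / r
print(err)
```

If the sorted eigenvalues really follow the semicircle quantile curve, the relative deviation `err` should be small (it is not exactly zero: finite-$n$ fluctuations and the shift from the nonzero mean of the entries both contribute).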

notebook

Accepted answer:

As $n \to \infty$ (with constant $p$) the eigenvalues of $G_{n,p}$ are essentially described by Wigner's semicircle law.

For this purpose, let $(Y_n)_{n=1}^\infty$ be a sequence of random $n \times n$ symmetric matrices, where

  • all off-diagonal entries are independent and follow the same distribution, with mean $0$ and variance $1$;
  • all diagonal entries are independent and follow the same distribution, with mean $0$ and finite variance.

Let $X_n = Y_n/\sqrt n$. As $n \to \infty$, the eigenvalues of $X_n$ converge to a distribution with PDF $f(t) = \frac1{2\pi}\sqrt{4-t^2}$, in the sense that the number of eigenvalues of $X_n$ in the range $[a,b] \subseteq [-2,2]$, divided by $n$, converges to $\displaystyle\int_a^b f(t)\,dt$.
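This convergence statement is easy to test numerically. Here is a Python/NumPy sketch (the matrix size and the interval $[a,b]$ are arbitrary choices) comparing the empirical fraction of eigenvalues in $[a,b]$ with $\int_a^b f(t)\,dt$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Symmetric matrix with iid N(0,1) off-diagonal entries (mean 0, variance 1)
# and iid N(0,1) diagonal entries (mean 0, finite variance).
Y = np.triu(rng.standard_normal((n, n)), 1)
Y = Y + Y.T + np.diag(rng.standard_normal(n))
X = Y / np.sqrt(n)

eigs = np.linalg.eigvalsh(X)

# Fraction of eigenvalues in [a, b] vs the semicircle integral.
a, b = -1.0, 1.0
frac = np.mean((eigs >= a) & (eigs <= b))
# Closed form: int_{-1}^{1} sqrt(4 - t^2)/(2 pi) dt = (sqrt(3)/2 + pi/3)/pi.
theory = (np.sqrt(3) / 2 + np.pi / 3) / np.pi
print(frac, theory)
```

The two printed numbers should agree to within a few percent; eigenvalue counts concentrate quite strongly, so the match is typically much better than a naive sampling argument would suggest.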


For $G_{n,p}$, we almost satisfy the hypotheses of the theorem, except that the distribution of the off-diagonal entries is Bernoulli($p$), which has mean $p \neq 0$ and variance $p(1-p)$ rather than $1$. The nonzero mean only affects the leading eigenvalue, which is $\approx np$; after rescaling by the variance, the other eigenvalues follow a semicircle law supported on $[-2\sqrt{np(1-p)},\, 2\sqrt{np(1-p)}]$.
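Both claims can be sanity-checked numerically; a Python/NumPy sketch (parameters are arbitrary choices): the leading eigenvalue should be close to $np$, and the second-largest should sit near the bulk edge $2\sqrt{np(1-p)}$ of the semicircle (a known result of Füredi and Komlós).

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 1500, 0.5

# Adjacency matrix of G(n, p).
A = np.triu(rng.random((n, n)) < p, 1)
A = (A + A.T).astype(float)

eigs = np.sort(np.linalg.eigvalsh(A))
lam1, lam2 = eigs[-1], eigs[-2]
print(lam1 / (n * p))                        # leading eigenvalue vs np
print(lam2 / (2 * np.sqrt(n * p * (1 - p)))) # second eigenvalue vs bulk edge
```

Both ratios should come out close to $1$, the first one very close and the second within a few percent (edge fluctuations shrink like $n^{-2/3}$).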

Here is an example with $n=2000$ and $p = \frac12$, where I plot a histogram of the actual eigenvalues against the theoretical semicircle (centered at $(0,0)$ with radius $\sqrt{2000}$, then normalized to integrate to $2000$). You don't see the leading eigenvalue of $\approx 1000$ here.


Mathematica code:

Show[
   Histogram[Eigenvalues@N@AdjacencyMatrix@
      RandomGraph@BernoulliGraphDistribution[2000, 0.5], {1}], 
   Plot[Sqrt[2000 - x^2]/(1000 Pi)*2000, {x, -45, 45}]]
Another answer:

Edit: This answer does not address the underlying question, because it mistook the plot for a histogram and answered the unasked question about the distribution of large eigenvalues -- e.g. $p(\lambda)$ for large $\lambda$ -- pointing out that in fact $p(\lambda) = 0$ for large enough $\lambda$, which is of course much narrower than an exponential distribution.

It's a basic fact that the maximum eigenvalue of an adjacency matrix is bounded above by the maximum degree of any vertex in the graph (you can find a proof of this on Wikipedia) -- are you sure your plot does not show the distribution for $n=2000$?

Here is a simple proof of the weaker property that the eigenvalues' magnitudes are bounded in terms of the number of vertices. The sum of squared eigenvalues $\sum_i \left| \lambda_i \right|^2$ equals the trace of the squared adjacency matrix $A^2$. Since the elements of the (symmetric) adjacency matrix are either $0$ or $1$, a diagonal element of $A^2$ is $$ (A^2)_{ii} = \sum_{j} A_{ij} A_{ji} = \sum_{j} A_{ij}^2 = \sum_{j} A_{ij} = d(i), $$ the degree of the $i$-th vertex. Since each degree is at most $n-1$, where $n$ is the number of vertices in the graph, this gives a hard upper bound on the magnitude of any single eigenvalue: $$ \left| \lambda_i \right|^2 \leq \sum_{i} \left| \lambda_i \right|^2 = \mathrm{tr} \left( A^2 \right) = \sum_{i} (A^2)_{ii} \leq n(n-1). $$
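The identities in this argument are easy to verify numerically. A Python/NumPy sketch (the graph size and edge probability are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 0.3

# Adjacency matrix of a G(n, p) sample.
A = np.triu(rng.random((n, n)) < p, 1)
A = (A + A.T).astype(float)

eigs = np.linalg.eigvalsh(A)
degrees = A.sum(axis=1)

sum_sq = np.sum(eigs**2)      # sum of |lambda_i|^2
trace_sq = np.trace(A @ A)    # tr(A^2), which equals the sum of degrees
print(sum_sq, trace_sq, degrees.sum())
print(eigs.max(), degrees.max())  # lambda_max <= max degree
```

The first three printed numbers should agree up to floating-point error, and the maximum eigenvalue should never exceed the maximum degree.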

When the edges of the graph are added randomly, the degrees of distinct vertices are of course correlated; however, by linearity of expectation no independence assumption is needed for the mean. Each vertex's degree is $p(n-1)$ on average, so $$ E \left[ \sum_{i} \left| \lambda_i \right|^2 \right] = p\,n(n-1). $$
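A quick Monte Carlo check of this expectation (Python/NumPy sketch; $n$, $p$, and the number of trials are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, trials = 300, 0.25, 50

vals = []
for _ in range(trials):
    # Fresh G(n, p) adjacency matrix each trial.
    A = np.triu(rng.random((n, n)) < p, 1)
    A = (A + A.T).astype(float)
    vals.append(np.trace(A @ A))  # = sum_i |lambda_i|^2

print(np.mean(vals), p * n * (n - 1))
```

The empirical mean of $\sum_i |\lambda_i|^2$ should land within about a percent of $p\,n(n-1)$ for these parameters.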