Bernoulli random graph as a special case of exponential random graph


Consider a Bernoulli random graph on $n$ vertices, where we fill in each edge independently with probability $p$. Let $X$ denote the random adjacency matrix of one such Bernoulli graph.

How can we express $\operatorname{Pr}\left[ X= x \right]$ in the form $\frac{1}{|\mathcal X|} e^{\theta s(x)}$ where $s(x) = \sum_{i<j} x_{ij}$ is the number of edges in $x$?

If $p=1/2$ then it's clear that $\operatorname{Pr}\left[ X= x \right] = \frac{1}{|\mathcal X|}$ so we must choose $\theta=0$. I am struggling to determine what it should look like for other values of $p$. I have a hunch that it'll be something like $\log \frac{1-p}{p}$ but I can't prove it.

Any help would be appreciated.

Best answer

In general, the probability of getting a specific random graph $x$ from the $G(n,p)$ model is $p^{s(x)}(1-p)^{\binom n2 - s(x)}$. There are $s(x)$ edges we want to be present, each of which is there with probability $p$. There are $\binom n2 - s(x)$ edges we don't want, and each of them is absent with probability $1-p$.
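As a sanity check of this formula (my own sketch, not part of the original answer), we can enumerate all $2^{\binom n2}$ edge subsets for a small $n$ and confirm that the probabilities $p^{s(x)}(1-p)^{\binom n2 - s(x)}$ sum to $1$:

```python
from itertools import product
from math import comb

n, p = 4, 0.3
N = comb(n, 2)  # number of possible edges, binom(n, 2)

total = 0.0
for edges in product([0, 1], repeat=N):  # every possible edge indicator vector
    s = sum(edges)  # s(x), the number of edges present
    total += p**s * (1 - p) ** (N - s)

print(total)  # ≈ 1.0
```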

We can write this as $(1-p)^{\binom n2}e^{s(x) \log \frac{p}{1-p}}$.

This has the form $C e^{\theta s(x)}$ with $\theta = \log \frac{p}{1-p}$, but $C$ is not $\frac1{|\mathcal X|}$, which should not be surprising; $C$ is determined entirely by the requirement that all the probabilities should add up to $1$. There are $\binom Ns$ graphs with $s$ edges, where $N = \binom n2$; therefore the probabilities of this form add up to $$ \sum_{s=0}^N \binom Ns C e^{\theta s} = C (1 + e^{\theta})^N $$ and so $C$ will always have to be $(1 + e^{\theta})^{-\binom n2}$ to make the sum equal to $1$. Indeed, with $\theta = \log \frac{p}{1-p}$ we get $1 + e^{\theta} = \frac1{1-p}$, so $C = (1-p)^{\binom n2}$, matching the expression above.
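To see numerically that the two parametrizations agree (a quick check of my own, assuming $\theta = \log\frac{p}{1-p}$ and $C = (1+e^\theta)^{-N}$ as derived above), we can compare $p^s(1-p)^{N-s}$ against $C e^{\theta s}$ for every edge count $s$:

```python
from math import comb, exp, log

n, p = 5, 0.3
N = comb(n, 2)
theta = log(p / (1 - p))           # natural parameter of the ERGM form
C = (1 + exp(theta)) ** (-N)       # normalizing constant, equals (1-p)^N

max_diff = max(
    abs(p**s * (1 - p) ** (N - s) - C * exp(theta * s)) for s in range(N + 1)
)
print(max_diff)  # ≈ 0: the two expressions coincide for every s
```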