The paper "Small-Worlds: Strong Clustering in Wireless Networks" (http://arxiv.org/pdf/0706.1063.pdf) is indicating empirically that the average node degree $<k>$ is (or can be approximated by) $\frac{\Pi r^2n}{l^2}$ whereby two nodes are connected if they are within range $r$, $n$ is number of nodes, $l$ length of the square.
I am trying to prove this relationship, but I haven't gotten anywhere. Any idea how to prove or disprove it? Some simulations I ran showed that the formula gets close to the average node degree $\langle k \rangle$ when the length $l$ is considerably larger than $r$. This makes sense, since the border effect is diminished in these cases: a node near the border has only a limited number of neighbors (see figure for an example).
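For reference, a minimal simulation sketch of the kind described above (my own setup, with illustrative parameter values): place $n$ points uniformly in an $l \times l$ square, connect pairs within range $r$, and compare the measured average degree against $\pi r^2 n / l^2$.

```python
import numpy as np

def avg_degree_square(n=2000, l=1.0, r=0.05, seed=0):
    """Measured average degree of a random geometric graph in an l x l square."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(0, l, size=(n, 2))
    # Pairwise Euclidean distances in the square (border effects included).
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    adj = (d < r) & ~np.eye(n, dtype=bool)  # connect within range r, no self-loops
    return adj.sum(axis=1).mean()

print(avg_degree_square())              # measured <k>
print(np.pi * 0.05**2 * 2000 / 1.0**2)  # predicted pi r^2 n / l^2
```

With $r \ll l$ the two numbers agree closely; shrinking $l$ toward $r$ makes the measured value fall below the prediction, as the border effect suggests.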
To avoid border effects, you can consider the geometric random graph (GRG) on a torus (which amounts to slightly changing the distance function).
In a GRG, the degree of a vertex is the number of other vertices at distance less than $r$, i.e. the number of vertices that fall within a circle of radius $r$ around it. Each of the remaining $n-1$ vertices, being uniformly distributed, independently has probability $\pi r^2 / l^2$ (the area of the circle over the total area) of being a neighbor. By linearity of expectation, the average degree is therefore $\langle k \rangle = (n-1)\frac{\pi r^2}{l^2} \approx \frac{\pi r^2 n}{l^2}$ for large $n$.
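A sketch of the same measurement on the torus (again my own, with illustrative parameters): wrap-around distances remove the border effect, so the measured average degree should match $(n-1)\pi r^2 / l^2$ closely.

```python
import numpy as np

def avg_degree_torus(n=2000, l=1.0, r=0.05, seed=0):
    """Measured average degree of a GRG on an l x l torus."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(0, l, size=(n, 2))
    diff = np.abs(pts[:, None, :] - pts[None, :, :])
    diff = np.minimum(diff, l - diff)  # toroidal (wrap-around) coordinate distance
    d = np.linalg.norm(diff, axis=-1)
    adj = (d < r) & ~np.eye(n, dtype=bool)  # connect within range r, no self-loops
    return adj.sum(axis=1).mean()

print(avg_degree_torus())                    # measured <k> on the torus
print((2000 - 1) * np.pi * 0.05**2 / 1.0**2) # predicted (n-1) pi r^2 / l^2
```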