Summing infinite series and relation to asymptotics (in computer science, graph theory, etc.)


I am quite new to theoretical computer science, but have studied graph theory before. Recently, I started thinking about why most asymptotic threshold functions, expressions of computational complexity, etc., are 'nice' functions built from exponentials, logarithms, and integer or fractional powers of $n$.

For instance, the threshold function for connectivity or Hamiltonicity in graph theory is $p^*(n) = \frac{\log n}{n}$. (A (weak) threshold function $p^*(n)$ for a graph property $\mathcal{P}$ is defined as follows: if $G_{n,p}$ is a random graph on $n$ vertices with each edge inserted independently with probability $p$, then as $n \rightarrow \infty$, the probability that $G_{n,p}$ is in $\mathcal{P}$ (has the property) tends to $0$ if $\frac{p}{p^*} \rightarrow 0$ and tends to $1$ if $\frac{p}{p^*} \rightarrow \infty$. The threshold for Hamiltonicity may be sharp, in which case there are constants such that $\frac{p}{p^*} < c$ or $> C$ suffices for the two limiting regimes, but I have to check this.)
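To make the threshold concrete, here is a small Monte Carlo sketch (pure Python, connectivity tested via union-find; the function names and parameters are mine, chosen for illustration). At $p = \frac{1}{2}\cdot\frac{\log n}{n}$ the graph should almost never be connected, and at $p = 2\cdot\frac{\log n}{n}$ it should almost always be:

```python
import math
import random

def gnp_connected(n, p, rng):
    """Sample G(n, p) once and test connectivity with union-find."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    components = n
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    components -= 1
    return components == 1

def connectivity_fraction(n, factor, trials=50, seed=0):
    """Fraction of samples that are connected at p = factor * log(n)/n."""
    rng = random.Random(seed)
    p = factor * math.log(n) / n
    return sum(gnp_connected(n, p, rng) for _ in range(trials)) / trials

# Below the threshold (factor 1/2) connectivity should be rare;
# above it (factor 2) it should be the norm.
print(connectivity_fraction(200, 0.5))
print(connectivity_fraction(200, 2.0))
```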

The thing is, exponentials, integer/fractional powers of $n$, and logarithms keep cropping up, and I am trying to gain intuition as to why this is the case. I have yet to see a complexity growing as a transcendental or, more generally, irrational power of $n$, or as something else entirely.

I have tried to think about this in terms of power series expansions. Of course, this would only give you integer powers of n. Thinking about combinations and permutations of n objects may be some motivation as to why we generally only need integer powers of n.

But still, why only exponentials and logarithms? Why not some more exotic function with a convergent power series, or something without a power series in integer powers of $n$?

Could it be that one can build up all power series using suitable combinations of logarithms, exponentials, and polynomials?

Best answer

The reason why you haven't encountered irrational powers of $n$ is the following zero-one law (due to Shelah and Spencer, and quoted from Theorem 10.4.2 of Alon and Spencer's Probabilistic Method):

For any irrational $\alpha$, $0 < \alpha < 1$, setting $p = p(n) = n^{-\alpha}$, and for any first-order sentence $A$, $$\lim_{n \to \infty} \Pr[G(n,p) \models A] = 0 \text{ or } 1.$$

Why is this bad news for thresholds? Well, it's generally going to be the case that if a property holds w.h.p. for $p \gg p^*$, and fails to hold w.h.p. for $p \ll p^*$, then it does something intermediate at $p^*$. So if the limit at $p$ is $0$ or $1$, then $p$ is not a threshold.

Of course, not all graph properties we care about are first-order. But many of them are, and many others happen at the same time as first-order properties (for example, the random graph becomes connected at the same time as it loses the last isolated vertex).

Any rational power of $n$ (in a reasonable range) is the threshold for some subgraph appearing in $G(n,p)$, so this tells the complete story for powers of $n$. (Key to the proof of the theorem above is the fact that with respect to $p = n^{-\alpha}$, every subgraph is either "sparse", with $\frac{v}{e} > \alpha$, or "dense", with $\frac{v}{e} < \alpha$. For a nice enough subgraph with $v$ vertices and $e$ edges, the expected number of copies is on the order of $n^{v - \alpha e}$, so there's either lots of it or none of it in $G(n,p)$, respectively.)
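The sparse/dense dichotomy amounts to checking the sign of the exponent $v - \alpha e$. A tiny sketch with exact rational arithmetic (function name is mine) for the triangle, whose density $\frac{v}{e} = 1$ makes $p^* = n^{-1}$ its appearance threshold:

```python
from fractions import Fraction

def copy_count_exponent(v, e, alpha):
    """The expected number of copies of a graph with v vertices and e edges
    in G(n, n^-alpha) is of order n^(v - alpha*e); return that exponent."""
    return Fraction(v) - alpha * Fraction(e)

# Triangle: v = 3, e = 3, density v/e = 1.
for alpha in (Fraction(1, 2), Fraction(3, 2)):
    exp = copy_count_exponent(3, 3, alpha)
    print(alpha, exp, "lots" if exp > 0 else "none")
```

Running this shows a positive exponent (many triangles) for $\alpha = \frac{1}{2}$ and a negative one (no triangles) for $\alpha = \frac{3}{2}$; any irrational $\alpha$ gives a nonzero exponent, so no subgraph count sits at its critical point.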


On top of this, we get logarithmic factors because of the Poisson distribution. In general, the default behavior to expect from a discrete random variable $X = X(n)$ as $n \to \infty$ is:

  • $\mathbb E[X] \to 0$ and so $X = 0$ w.h.p., or
  • $\mathbb E[X] \to c$ for some constant $c>0$ and $X$ converges to a $\text{Poisson}(c)$ distribution, or
  • $\mathbb E[X] \to \infty$ and $X \sim \mathbb E[X]$ w.h.p. (maybe with a normal distribution on the appropriate scale).

Of course, these don't always happen, but they're what usually happens.

Anyway, if $X \sim \text{Poisson}(c)$ in the limit, then $\Pr[X=0] \to e^{-c}$, which gives us exponential functions; when we want this probability to annihilate a power of $n$, we want $\mathbb E[X] \sim k \log n$ instead, which gives us logarithms.
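To see concretely how the Poisson tail turns logarithms into polynomial decay, here is a one-line numerical check (pure Python; names are mine):

```python
import math

def poisson_zero_prob(mean):
    """P[X = 0] for X ~ Poisson(mean)."""
    return math.exp(-mean)

# With E[X] = k*log(n), the zero-probability exp(-E[X]) equals n^-k:
# exactly the polynomial decay needed to beat a union bound over ~n^k events.
n, k = 1000, 2
val = poisson_zero_prob(k * math.log(n))
print(val)  # about n^-2 = 1e-6
```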