Asymptotic Problem Complexity = Infimum of Asymptotic Algorithm Complexities?


When we talk about the complexity of algorithms and problems, the complexity of a problem is defined as the infimum of the complexities of the algorithms that solve it.

My question is:

Is the asymptotic complexity of the problem always the infimum of the asymptotic complexities of the algorithms?

This is certainly not the case if we take infima of arbitrary functions, not only of those corresponding to algorithms solving a specific problem. For example, take

\begin{equation} f_i(n)=\begin{cases} 1 & n<i\text{,}\\ n & \text{else.} \end{cases} \end{equation}

We have \begin{equation} \Theta(\inf_i f_i) = \Theta(1)\text{,} \end{equation} but \begin{equation} \inf_i \Theta(f_i) = \inf_i\Theta(n)=\Theta(n)\text{.} \end{equation}
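As a small numerical illustration of this counterexample (the helper names and the truncation of the index range are mine, chosen just so the check runs):

```python
# The family f_i(n) = 1 if n < i, else n.
# Each individual f_i is Theta(n), yet the pointwise infimum over i
# is the constant 1, because for every fixed n we can choose i > n.

def f(i, n):
    """The i-th function in the family (indices i >= 1)."""
    return 1 if n < i else n

def pointwise_inf(n, max_i=1000):
    """Infimum over i of f(i, n); truncating at max_i suffices for n < max_i."""
    return min(f(i, n) for i in range(1, max_i))

# Every f_i eventually agrees with n ...
assert f(10, 50) == 50
# ... but the pointwise infimum is 1 at every point checked.
assert all(pointwise_inf(n) == 1 for n in range(100))
```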

Best answer:

The same problem that you've found in the abstract case can show up for actual problems.

Pick your favorite decision problem with really really bad asymptotic complexity: say, $\Theta(2^{2^n})$.

We're going to define a sequence of algorithms for this problem. The $k^{\text{th}}$ algorithm in this sequence will treat $k$-bit inputs as a special case.

Whatever the decision problem is, deciding it for all $k$-bit inputs is equivalent to computing some function $f:\{0,1\}^k \to \{0,1\}$. By a result of Shannon, there is a circuit of size $O(2^k/k)$ computing $f$; the algorithm will handle all $k$-bit inputs by just evaluating this circuit. This takes polynomial time in the size of the circuit, so we can bound the time by $C^k$ for some constant $C$ (the same constant $C$ works for all $k$). For all other input sizes, the algorithm will just default to whatever the $\Theta(2^{2^n})$ approach is.

All of these algorithms have asymptotic complexity $\Theta(2^{2^n})$. But the actual functions representing these complexities look like $$ f_k(n) = \begin{cases}C^n & n = k, \\ 2^{2^n} & \text{otherwise,}\end{cases} $$ and so taking their pointwise infimum gives us the function $C^n$, which is $\Theta(C^n)$.
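A quick numerical check of this pointwise infimum (the choice $C = 4$ and the truncation of the index range are arbitrary, made only so the check runs):

```python
# Each f_k deviates from 2^(2^n) at only the single point n = k, so each
# is Theta(2^(2^n)); yet the pointwise infimum over k equals C^n everywhere.
C = 4  # arbitrary constant, standing in for the C of the argument

def f(k, n):
    return C**n if n == k else 2**(2**n)

def pointwise_inf(n, max_k=10):
    """Infimum over k of f(k, n); truncating at max_k suffices for n < max_k."""
    return min(f(k, n) for k in range(max_k))

# The infimum coincides with C^n at every point checked.
assert all(pointwise_inf(n) == C**n for n in range(8))
```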


This is actually a really important feature of complexity theory: we need to distinguish the circuit complexity of a problem (the size of the smallest circuit solving it on $n$-bit inputs, as a function of $n$) from its uniform circuit complexity (the same quantity, under the additional requirement that some Turing machine output the $n^{\text{th}}$ circuit in $\operatorname{poly}(n)$ time). The construction above is exploiting the non-uniformity of the circuits Shannon's theorem gives us.

P.S. I guess we could replace the use of Shannon's theorem in this argument by a lookup table, but then we wouldn't get the nice connection to circuit complexity.