Why does one define proper complexity functions?


Definition: A proper complexity function is a function $f:\mathbb N\longrightarrow\mathbb N$ such that:

  1. $f$ is nondecreasing

  2. There exists a $k$-string Turing machine $M$ such that on any input of length $n$, $M$ halts after $O(n + f(n))$ steps, uses $O(f(n))$ space, and outputs $f(n)$ consecutive blanks.
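For instance, $f(n)=n^2$ satisfies both conditions. A rough sketch in Python (the list-based tape and the explicit step counter are only an informal stand-in for a real $k$-string Turing machine, not a faithful simulation):

```python
def yardstick(input_word):
    """Toy 'yardstick' machine for f(n) = n^2.

    On an input of length n, write f(n) = n^2 blanks while counting the
    basic steps taken, so we can check the O(n + f(n)) time bound.
    Informal model only, not a faithful Turing-machine simulation.
    """
    n = len(input_word)
    steps = 0
    output = []
    # Scan the input once: O(n) steps.
    for _ in input_word:
        steps += 1
    # Write f(n) = n^2 blanks: O(f(n)) steps.
    for _ in range(n):
        for _ in range(n):
            output.append(' ')  # one blank symbol
            steps += 1
    return output, steps

blanks, steps = yardstick('a' * 10)
# n = 10, f(n) = 100: exactly 100 blanks are written,
# and steps = 110 <= 2 * (n + f(n)), as condition 2 demands.
```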


I don't understand why one should introduce this special class of functions to define time or space bounds. For example, if I write an algorithm for a Turing machine, why can't I use an arbitrary function $g:\mathbb N\longrightarrow \mathbb N$ to measure the number of steps?

Condition $1.$ is clear: the bound should not decrease as the input grows. But what is the purpose of condition $2.$?



Best answer:

"Proper complexity function" is not, as far as I'm aware, standard terminology. In my experience the concept you define is more often called a constructible function, or "time-constructible".

Certainly there's nothing morally deficient about using non-constructible functions to measure resource usage of an algorithm you write (which the word "proper" might suggest to some).

However, there are technical results where you start from a complexity class and construct a problem or a program that relates to that class in a particular way; the time and space hierarchy theorems are the typical examples. For these theorems the function you start with must be sufficiently well-behaved that the constructions in the proofs go through, and the definition you quote ensures precisely that a "proper" function is well-behaved enough for most of these hierarchy results.
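To make that concrete: the diagonalization in the time hierarchy theorem needs a machine that simulates another machine for at most $f(n)$ steps and then cuts it off, which means the budget $f(n)$ must itself be computable without blowing the time bound; that is exactly what constructibility buys you. A hedged Python sketch of such a clocked simulation, where an arbitrary generator stands in for the step-by-step simulation of a Turing machine (the names `clocked_run`, `fast`, `slow` are illustrative, not from any standard library):

```python
def clocked_run(machine_steps, f, input_word):
    """Run a step-enumerated 'machine' for at most f(n) steps.

    machine_steps: a generator function yielding one value per machine
                   step -- a stand-in for single-step TM simulation.
    f: a time bound; constructibility means computing f(n) here is cheap.
    Returns ('halted', last_value) or ('timeout', None).
    """
    budget = f(len(input_word))  # must itself fit in the time bound
    last = None
    for step, last in enumerate(machine_steps(input_word), start=1):
        if step >= budget:
            return ('timeout', None)  # clock expired: cut the run off
    return ('halted', last)

def fast(word):
    yield len(word)       # halts after one step

def slow(word):
    while True:
        yield None        # never halts on its own

print(clocked_run(fast, lambda n: n * n, 'aaa'))  # ('halted', 3)
print(clocked_run(slow, lambda n: n * n, 'aaa'))  # ('timeout', None)
```

The point of the sketch: without the clock, `clocked_run(slow, ...)` would never return, and the diagonalizing machine could not stay within its own time bound.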