Does a function with a growth rate of $\frac{1}{N}$ still grow in constant time?


I am being asked to order functions by growth rate and I didn't make much of this case until I started thinking about it more and more.

Let's postulate that we have some function

public void doStuff(int N) {
    final int k = 1000000;
    int m = k / N;  // integer division: m == 0 whenever N > k (assumes N > 0)

    for (int i = 0; i < m; i++) {
        // Do stuff
    }
}
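To see the behavior concretely, here is a minimal sketch (the helper name `tripCount` is mine, not from the question) showing how integer division drives the number of iterations to zero once $N$ exceeds $k$:

```java
public class TripCountDemo {
    // Number of times the loop body runs for a given N (assumes N > 0).
    static int tripCount(int n) {
        final int k = 1000000;
        return k / n; // integer division truncates: 0 whenever n > k
    }

    public static void main(String[] args) {
        System.out.println(tripCount(3));       // 333333 iterations
        System.out.println(tripCount(1000001)); // 0 iterations
    }
}
```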

At first glance, it seems to me that for larger and larger inputs $N$, this function will execute faster and faster. However, with regard to the growth of functions, aren't we concerned with what happens as $N \to \infty$, where $$\lim_{N\to\infty} \frac{1}{N} = 0?$$

The method I posted above does execute faster for larger inputs of $N$, but only up to a certain point. Past that point, wouldn't the run time be $0$?

With a function that grows at a rate of $\frac{1}{N}$, wouldn't its running time be bounded by that of a constant-time algorithm? It doesn't make sense to me that, for large enough values of $N$, a function could have a run time of $0$ while still doing something.

2 Answers

Best answer:

The code you gave has a constant running time, which is typically considered the fastest possible for an algorithm. The point is that just starting the algorithm and checking whether you want to terminate immediately takes, say, $c$ time units. The running time of an algorithm is then $c + f(n)$ for some function $f$. Hence, if $f$ is decreasing, the constant $c$ will dominate the running time asymptotically. In particular, you can never have runtime $0$.
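The answer's point can be made concrete with a toy cost model (the constant `C` and the helper below are illustrative assumptions, not real measurements): the total cost is $c + f(N)$, so it can never drop below $c$, let alone to $0$:

```java
public class CostModel {
    static final int K = 1000000;
    static final long C = 5; // assumed fixed overhead: call, check, return

    // Toy cost: C units of overhead plus one unit per loop iteration.
    static long cost(int n) {
        return C + K / n; // K / n plays the role of the decreasing f(n)
    }

    public static void main(String[] args) {
        System.out.println(cost(1));       // 1000005
        System.out.println(cost(2000000)); // 5: the constant dominates
    }
}
```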

On the other hand, if you only consider functions, you can of course have a function with growth $1/N$, and such a function "grows" strictly more slowly than a constant function, i.e., $1/N \in o(1)$. You can check this using the formal definition of little-o notation.
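For completeness, the check goes through directly from the standard definition ($f \in o(g)$ iff for every $\varepsilon > 0$ there is an $N_0$ with $|f(N)| \le \varepsilon\,|g(N)|$ for all $N \ge N_0$):

```latex
Claim: $\frac{1}{N} \in o(1)$.

Proof sketch: fix any $\varepsilon > 0$ and take $N_0 = \lceil 1/\varepsilon \rceil$.
Then for every $N \ge N_0$,
\[
  \frac{1}{N} \le \frac{1}{N_0} \le \varepsilon = \varepsilon \cdot 1,
\]
so the definition of $o(1)$ is satisfied. $\qed$
```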

Second answer:

First remark: when $N > k$, your function isn't doing any "real" stuff any more: integer division gives $m = k/N = 0$, so your loop starts at $i = 0$ and immediately fails the test $i < m$. You never enter the loop.

Second, if you're going to rank functions by big-Oh complexity, then you can indeed use the notation $O(\frac 1N)$. In particular, any $O(\frac 1N)$ algorithm is also $O(1)$. It's unusual, since most algorithms take longer to execute for large $N$, but nothing forces that. Let's see why with the following contrived example.

Assume I'm an evil hacker of unknown nationality who wants to hack the US presidential election: my goal is to make the number of votes for the orange candidate high enough for him to win the state. If the other candidate collected $1\ 000\ 000$ votes in a state, then my orange candidate will need that many plus one.

  • If he only has $N = 10$ votes, then I have to forge $999\ 991$ votes, which takes quite a while.
  • On the other hand, if the orange candidate already has $N = 10\ 000\ 000$ votes in this state, I don't need to do anything: my algorithm is faster for large $N$.
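The amount of work in this story can be sketched as a decreasing function of $N$ (the method name and target below are my own illustration of the answer's scenario):

```java
public class ElectionHack {
    static final long OPPONENT = 1000000; // votes for the other candidate

    // Votes I must forge: decreases as N grows, bottoming out at zero work.
    static long votesToForge(long n) {
        return Math.max(0, OPPONENT + 1 - n);
    }

    public static void main(String[] args) {
        System.out.println(votesToForge(10));       // 999991
        System.out.println(votesToForge(10000000)); // 0
    }
}
```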