I am being asked to order functions by growth rate and I didn't make much of this case until I started thinking about it more and more.
Let's postulate that we have some function
```java
public void doStuff(int N) {
    final int k = 1000000;
    int m = k / N;
    for (int i = 0; i < m; i++) {
        // Do stuff
    }
}
```
At first glance, it seems to me that for larger and larger inputs $N$, this function will execute faster and faster. But when discussing the growth of functions, are we not concerned with the behavior as $N \to \infty$, where $$\lim_{N\to\infty} \frac{1}{N} = 0?$$
The method above does execute faster for larger inputs of $N$, but only up to a certain point: once $N > k$, the integer division gives $m = 0$, the loop never runs, and the run time would seemingly be $0$.
With a function that grows at a rate of $\frac{1}{N}$, wouldn't its running time eventually be bounded below, just as a constant-time algorithm's is? It doesn't make sense to me that for large enough values of $N$, an algorithm could have a run time of $0$ while it still does something.
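To make the cutoff concrete: since Java's integer division truncates toward zero, $m = k/N$ becomes $0$ as soon as $N > k$, and the loop body stops executing entirely. A minimal sketch (the class name is mine; $k$ is the constant from the method above):

```java
public class IterationCount {
    static final int K = 1000000;

    // Number of loop iterations doStuff(N) would perform
    static int iterations(int N) {
        return K / N; // integer division truncates toward zero
    }

    public static void main(String[] args) {
        System.out.println(iterations(1000));  // 1000 iterations
        System.out.println(iterations(K));     // 1 iteration
        System.out.println(iterations(K + 1)); // 0 iterations: loop never runs
    }
}
```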
The code you gave has a constant running time, which is typically considered the fastest possible for an algorithm. The point is that just starting the algorithm and checking whether you want to terminate immediately takes, say, $c$ time units. The running time of an algorithm is then $c + f(n)$ for some function $f$. Hence, if $f$ is decreasing, the constant $c$ will dominate the running time asymptotically. In particular, you can never have runtime $0$.
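One way to see the constant $c$ is to count basic operations explicitly. In this sketch (an instrumented rewrite of the question's method, purely for illustration), even when the loop body runs zero times, the division and the first loop-condition check still cost something, so the count is never $0$:

```java
public class OpCounter {
    // Counts the "basic operations" performed by doStuff(N)
    static long countOps(int N) {
        final int k = 1000000;
        long ops = 0;
        ops++;                      // computing m = k / N
        int m = k / N;
        for (int i = 0; ; i++) {
            ops++;                  // each loop-condition check
            if (!(i < m)) break;
            ops++;                  // each execution of the loop body
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(10));      // large count: loop dominates
        System.out.println(countOps(2000000)); // m = 0, but still 2 ops, not 0
    }
}
```

So the measured cost behaves like $c + f(N)$ with $f(N) \approx k/N$: for large $N$ the constant part is all that remains.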
On the other hand, if you consider only the functions themselves, you can of course have a function with growth $1/N$, and such a function "grows" strictly slower than any constant function, i.e., $1/N \in o(1)$. You can check this using the formal definition of the little-o notation.
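Spelled out, one common definition says $f \in o(g)$ iff for every $c > 0$ there exists an $N_0$ such that $f(N) < c \cdot g(N)$ for all $N > N_0$. Taking $f(N) = 1/N$ and $g(N) = 1$, for any given $c > 0$ we can pick $N_0 = 1/c$, since
$$\frac{1}{N} < c \cdot 1 \quad \text{whenever} \quad N > \frac{1}{c},$$
which confirms $1/N \in o(1)$.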