Relation between Big-O and the limit of a function?

It is not clear to me what the relationship (if any) is between Big-O and the limit of a function. Suppose that a function satisfies $f(x)=O(g(x))$: does this imply that $f(x)$ converges to a limit as $x\rightarrow \infty$? And vice versa, suppose that $f(x)$ converges to a limit as $x\rightarrow \infty$: does this imply that there exists a function $g(x)$ such that $f(x)=O(g(x))$?

There are 4 answers below.

BEST ANSWER

When we say a function $f$ is $O(g)$ we are making a statement about its asymptotic behavior. This means that there exists some threshold $M\gt 0$ and some constant $N$ such that $x\gt M\implies |f(x)|\le N|g(x)|$.
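
To make this concrete, here is one worked instance of the definition (an illustration of mine, not part of the original answer): take $f(x)=3x^2+5x$ and $g(x)=x^2$. Then $M=1$ and $N=8$ work, since $$x\gt 1\implies |3x^2+5x|\le 3x^2+5x^2=8x^2=8|g(x)|.$$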

Let's compare this to the definition of a limit.

$\lim_{x\to\infty}f(x) = L$ if for all $\epsilon\gt 0$ there exists $M\gt 0$ such that $x\gt M\implies |f(x)-L|\lt\epsilon$.

The two definitions are similar but not the same. With big $O$ notation we are simply saying that, for large $x$, the magnitude of $f$ is no larger than a constant multiple of the magnitude of $g$. The limit is saying that $f$ is getting arbitrarily close to a constant.
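
A concrete function that separates the two notions: $f(x)=\sin x$ satisfies $|\sin x|\le 1$ for all $x$, so $\sin x$ is $O(1)$, yet $\lim_{x\to\infty}\sin x$ does not exist.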

To see if you understand this I leave you with two exercises.

  1. If $\lim_{x\to\infty}f(x) = L$, prove that $f$ is $O(1)$
  2. Prove that $f$ is $O(f)$

ANSWER

As a mathematics major, I'll give this question a more mathematical explanation. As a first resource, you should look at the Wikipedia article on big-O notation; it is a great resource.

After this, I would like to say that, at least from a math perspective, the equality sign in "$f(x) = O(g(x))$" is an abuse of notation: $f(x)$ is not literally equal to $O(g(x))$. There is a great discussion of this point in the Wikipedia article.

What relates to limits in the expression "$f(x) = O(g(x))$"? Well, the motivation starts when we would like to talk about the behavior of a function $f$ as values of $x$ get arbitrarily large. Thus, by $f(x) = O(g(x))$ it is often meant that there exists an $M \in \mathbb{R}$ such that $$|f(x)| \leq M \cdot |g(x)|$$ for all $x \geq x_0$, for some $x_0 \in \mathbb{R}$. In particular, the condition only constrains $f$ as $x \rightarrow \infty$; the most straightforward examples are when $f$ and $g$ are polynomials.
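
For instance (a worked polynomial example of mine): with $f(x) = 2x^3 + x$ and $g(x) = x^3$, we have $$|2x^3 + x| \leq 2x^3 + x^3 = 3x^3 = 3|g(x)| \quad \text{for all } x \geq 1,$$ so $f(x) = O(g(x))$ with $M = 3$ and $x_0 = 1$.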

As a more accurate rendering of $f(x) = O(g(x))$, we should write $f(x) \in O(g(x))$, where $O(g(x))$ is a set. In particular, it is the class of all functions whose growth is dominated by a constant multiple of $g(x)$: "thinking of $O(g(x))$ as the class of all functions $h(x)$ such that $|h(x)| \leq C|g(x)|$ for some constant $C$". This makes much more sense and answers your last question.
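
To illustrate the set point of view (examples of mine, restricting to $x \geq 1$): $x$, $5x + \sin x$, and $\sqrt{x}$ all lie in the set $O(x)$, while $x^2 \notin O(x)$, since $x^2/x = x$ eventually exceeds any constant $C$.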

In the context of computer science, big-O notation is probably used too liberally.

ANSWER

Big-O notation isn't really about limits as $x\to\infty$. It's about rate of growth: how quickly functions diverge (get big and stay big). In what follows, assume we're talking about functions that take nonnegative values, defined either on the nonnegative reals or on the nonnegative integers.

$O$ is uninteresting for a function that has a limit as $x\to\infty$:

If $\lim_{x\to\infty}f(x) = L$ for some real number $L$, then $f = O(1)$.

Thus if $f$ is such a function, it's one of the slowest growing, and least interesting, functions, asymptotically speaking. To see this, note that there's $x_0$ such that for all $x\ge x_0$, $f(x) < L + 1 = k\cdot 1$ where $k=L+1$.
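
For instance, $f(x) = 5 + 1/x$ has limit $L = 5$, and indeed $f(x) \le 6 = k\cdot 1$ for all $x \ge 1$, so $f = O(1)$ with $k = L + 1 = 6$.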

So in a sense, $O$ notation is only interesting for functions that don't have limits as their arguments "approach infinity".

$O$ notation comes into its own as a means of comparing rates of growth, e.g. in worst-case analyses of the running time of algorithms. To say that $f = O(g)$ means that eventually (for all $x\ge$ some $x_0$), $f$ is bounded above by a constant times $g$: up to a constant multiple, $g$ is eventually an upper bound of $f$.

A few standard examples:

  • if $f$ is $O(\log_2 n)$ (meaning, the function $n\mapsto \log_2 n$) then $f$ grows fairly slowly;
  • if $f$ is $O(n^2)$, it's bounded above by a quadratic polynomial;
  • if $f$ is $O(2^n)$ then it's bounded above by an exponential function, which gets big fast.

Whereas $\log_2 n$ is $O(n^2)$ (and $\log_2$ is also $O(2^n)$), the reverse is not true: $n^2$ outstrips $\log_2 n$; you can't bound it above by any constant times any log function. Similarly, $2^n$ grows faster than any polynomial, and faster than $\log_2$.
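
As a quick numerical illustration of these claims (a sketch of mine, not part of the original answer), the following Python snippet prints the ratios $n^2/\log_2 n$ and $2^n/n^2$ for growing $n$; both ratios blow up, which is why no single constant can reverse either bound:

```python
import math

# Compare growth rates: n^2 vs log2(n), and 2^n vs n^2.
# If f outstrips g, the ratio f(n)/g(n) grows without bound,
# so no constant C can make f(n) <= C*g(n) hold eventually.
for n in [2 ** k for k in range(2, 7)]:  # n = 4, 8, 16, 32, 64
    print(f"n={n:3d}   n^2/log2(n) = {n**2 / math.log2(n):8.1f}   "
          f"2^n/n^2 = {2**n / n**2:14.1f}")
```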

ANSWER

The other answers provide context; I'll just give condensed answers to your questions:

1. No, $f(x)\in\mathcal{O}(g(x))$ does not imply that $f$ converges. Take for example $g=f$: for every $f$ we have $f(x)\in\mathcal{O}(f(x))$ (trivially), but if you choose a divergent $f$, it doesn't converge (a concrete instance is spelled out after item 2).

2. Yes, convergence implies $f(x)\in\mathcal{O}(g(x))$ for a suitable $g$, since a convergent $f$ is bounded for large $x$, i.e. $|f(x)|\leq M\cdot 1$ for all large enough $x$ (see the bound spelled out below). So we can even choose $g(x)=1$ for all $x$ and deduce $f(x)\in\mathcal{O}(1)$.
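
Two small supplements (mine, for completeness): for 1., a concrete instance is $f(x) = g(x) = x$, where $f(x)\in\mathcal{O}(g(x))$ with constant $1$, yet $f(x)\to\infty$. For 2., the bound comes from taking $\epsilon = 1$ in the limit definition: there is an $x_0$ with $|f(x) - L| \lt 1$ for all $x \gt x_0$, hence $$|f(x)| \leq |f(x) - L| + |L| \lt |L| + 1$$ for such $x$, so $M = |L| + 1$ works.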