big O, small o and op notations


I got really confused with these three notations and their relations.

$O(n)$, $o(n)$ and $o_p(n)$. Also, if we have a relation like $\sup_t |c_n(t)-c(t)| \xrightarrow{P} 0$, can we use any of the three notations mentioned above to represent it?

Thanks a lot.


Informally, $f(n) = O(g(n))$ means that "$f$ is eventually bounded from above by (a properly scaled version of) $g$."

So, $f(n)=2n$ is not bounded from above by $g(n)=n$, but it is bounded by $3g(n)$. Therefore $2n = O(n)$.

Also, $n + 3$ is not bounded from above by $n$, but for $n>3$, it is bounded by $2n$. Therefore $n+3 = O(n)$ too.

It's quite possible for distinct $f$ and $g$ to be such that $f = O(g)$ and, at the same time, $g = O(f)$. For example, $n = O(n+3)$. (We saw that $n+3 = O(n)$.) This is not always true, though. For example, $2n = O(n^2)$, but not the other way around.
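A quick numerical sketch (in Python; not part of the original answer) of the bounds just discussed. The helper `eventually_bounded` is a hypothetical name, and checking a finite range is of course evidence, not a proof:

```python
# Finite-range check that f(n) <= k*g(n) for n >= n0 (illustration, not a proof).
def eventually_bounded(f, g, k, n0, n_max=10_000):
    return all(f(n) <= k * g(n) for n in range(n0, n_max))

# 2n = O(n): witnessed by k = 3 (any k >= 2 works).
assert eventually_bounded(lambda n: 2 * n, lambda n: n, k=3, n0=1)

# n + 3 = O(n): witnessed by k = 2 once n > 3.
assert eventually_bounded(lambda n: n + 3, lambda n: n, k=2, n0=4)

# 2n = O(n^2), but n^2 is not O(2n): no fixed k works forever.
assert eventually_bounded(lambda n: 2 * n, lambda n: n * n, k=1, n0=2)
assert not eventually_bounded(lambda n: n * n, lambda n: 2 * n, k=10, n0=21)
```

The last check fails as soon as $n > 2k$, which is exactly why no constant $k$ can make $n^2 = O(n)$ work.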

The little-oh notation is used to make a stronger claim. If $f(n) = o(g(n))$, then

$$\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0 \enspace.$$

If $f(n) = o(g(n))$, then it is not the case that $g(n) = o(f(n))$. In addition, if $f = o(g)$, then also $f = O(g)$, but not the other way around. (This is what it means for little-oh to make a stronger statement.)
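The ratio in the limit above can be sampled numerically (a Python sketch, evaluating at a single large $n$ rather than taking a true limit):

```python
# Sample f(n)/g(n) at a large n to see whether it is heading to 0 (little-oh)
# or to a nonzero constant (big-oh only).
def ratio(f, g, n):
    return f(n) / g(n)

n = 10**6

# 2n = o(n^2): the ratio 2/n shrinks toward 0.
assert ratio(lambda n: 2 * n, lambda n: n * n, n) < 1e-5

# n + 3 = O(n) but NOT o(n): the ratio tends to 1, not 0.
assert abs(ratio(lambda n: n + 3, lambda n: n, n) - 1) < 1e-5
```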


Another way to appreciate the difference between little and big oh is to note that $f = O(1)$ means that $f$ is eventually bounded by a constant, while $f = o(1)$ means that $f$ eventually goes to $0$. For instance,

$$ f(n) = \frac{2n}{n+2} $$

is $O(1)$, but is not $o(1)$, whereas

$$ f(n) = \frac{2n}{(n+2)^2} $$

is both $O(1)$ and $o(1)$.
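The two example functions can be evaluated directly (Python sketch; the thresholds are arbitrary illustration values):

```python
def f1(n):
    # 2n/(n+2) -> 2: eventually bounded by a constant, but does not go to 0,
    # so f1 is O(1) but not o(1).
    return 2 * n / (n + 2)

def f2(n):
    # 2n/(n+2)^2 -> 0, so f2 is o(1) (and hence also O(1)).
    return 2 * n / (n + 2) ** 2

assert abs(f1(10**6) - 2) < 1e-5   # close to 2, not to 0
assert f2(10**6) < 1e-5            # close to 0
```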


In the formal definition of $f(n) = O(g(n))$, we make use of two constants, $k$ and $n_0$. We say that $f(n) = O(g(n))$ if there exist positive $k$ and $n_0$ such that for $n \geq n_0$, $f(n) \leq k g(n)$.

Now suppose that we consider a claim like, "if $f(n)$ is a polynomial in $n$ and $g(n)$ is an exponential $b^n$ with $b>1$, then $f(n) = O(g(n))$." This claim is true, but the constants we use depend on the degree of the polynomial and the base $b$ of the exponential. That is, we cannot find $n_0$ and $k$ that work for all possible polynomials and all values of $b$. (If we fix $b$, $k$, and $n_0$, we can always find a polynomial $f$ such that $f(n_0+1) > k\, b^{n_0+1}$. On the other hand, $b^n$ will eventually overtake even this $f$.)

We may stress this fact by writing $f(n) = O_{k,n_0}(g(n))$. Note that we are making a claim about parametrized families of functions.
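A concrete instance of the parenthetical argument above, as a Python sketch with specific (arbitrary) choices of $b$, $k$, $n_0$, and degree $d$:

```python
# Fix the exponential base b = 2 and constants k = 1, n0 = 10.
b, k, n0 = 2, 1, 10
n = n0 + 1

# A polynomial of high enough degree d beats k * b^n at n = n0 + 1 ...
d = 12
assert n**d > k * b**n        # 11^12 is far larger than 2^11

# ... yet this particular polynomial is still eventually dominated by b^n.
big = 200
assert big**d < b**big        # 200^12 < 2^200
```

So for each fixed polynomial the claim holds, but the witnessing constants $k$ and $n_0$ must grow with the degree, which is what the subscripted notation $O_{k,n_0}$ is meant to emphasize.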