Clarification on the big oh of the sum of two functions


In computing the asymptotic complexity of the sum of two functions, one theorem states that if $\large\lim_{n\rightarrow\infty}\frac{f_2(n)}{f_1(n)}$ exists (as a finite number), then the asymptotic complexity of the sum is $O(f_1(n))$. The author gave an example of adding $f_1(n)=n^3$ and $f_2(n)=n^2$. He determined that the limit exists (it is $0$) and concluded that the complexity was $O(f_1(n))$, i.e. $O(n^3)$.

Obviously the order of the functions matters, because if we switched them around and made $f_1(n)=n^2$ and $f_2(n)=n^3$, then we couldn't compute the complexity that way, as $\large\lim_{n\rightarrow\infty}\frac{n^3}{n^2} = \infty$, which is not a finite limit.
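To see the asymmetry numerically, here is a quick sketch in plain Python (my own illustration, not part of the theorem): one ratio shrinks toward $0$, the other grows without bound.

```python
# f1(n) = n^3, f2(n) = n^2 from the question.
f1 = lambda n: n**3
f2 = lambda n: n**2

for n in [10, 1000, 100000]:
    # f2/f1 -> 0 (finite limit exists), f1/f2 -> infinity (no finite limit)
    print(n, f2(n) / f1(n), f1(n) / f2(n))
```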

Is this correct? If so, how do I make sure I am adding the right way?


BEST ANSWER

Your calculations are correct, but I think the phrasing is what's confusing you: order doesn't matter here in quite the way you seem to think it does (if I'm interpreting your statement correctly). Carefully phrased, you've found:

Because $$\lim \frac{n^2}{n^3} = 0$$ we can say that $n^2$ is $O(n^3)$ (it's also $O(n^2)$). Because $$\lim\frac{n^3}{n^2} =\infty$$ we can say that $n^3$ is not $O(n^2)$.

I think maybe the example you were looking for is:

Since $$\lim\frac{n^3+n^2}{n^3} =1$$ we know that $n^3 +n^2 = n^2 +n^3$ is $O(n^3)$.

A major use of this (in a slightly different context $x\to 0$) is to make statements of the form: $$e^x = 1 + x + O(x^2)$$ Hope it helps.
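As a numeric sanity check on that last statement (my own sketch, not from the answer): the remainder ratio $(e^x - 1 - x)/x^2$ stays bounded as $x \to 0$ (it tends to $\tfrac12$), which is exactly what $e^x = 1 + x + O(x^2)$ asserts.

```python
import math

def remainder_ratio(x):
    # (e^x - (1 + x)) / x^2 approaches 1/2 as x -> 0,
    # witnessing e^x = 1 + x + O(x^2).
    return (math.exp(x) - 1.0 - x) / x**2

for x in [0.1, 0.01, 0.001]:
    print(x, remainder_ratio(x))
```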

ANSWER

This all derives from the fact that if $f_2 = O(f_1)$, i.e. $f_2$ is of lower or equal order, then there exist some $n_0$ and a constant $a$ such that $f_2(n) \le a f_1(n)$ for all $n > n_0$, and hence $f_1 + f_2 \le f_1 + a f_1 = (1+a)f_1 = O(f_1)$.
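Concretely, with the question's $f_1(n)=n^3$ and $f_2(n)=n^2$, the constants $a=1$ and $n_0=1$ work (my own illustrative choice): for all $n \ge 1$ we have $n^2 \le n^3$, so $n^3 + n^2 \le 2n^3$. A quick check:

```python
# Witness constants for n^3 + n^2 = O(n^3):
# take a = 1 and n0 = 1, so f2(n) <= a * f1(n) for n >= n0,
# hence f1(n) + f2(n) <= (1 + a) * f1(n) = 2 * n^3.

def bounded(n, a=1):
    f1, f2 = n**3, n**2
    return f1 + f2 <= (1 + a) * f1

assert all(bounded(n) for n in range(1, 1000))
```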