Usefulness of asymptotic (e.g., Big O) notation embedded in expressions


Recently I've been working through The Probabilistic Method by Alon and Spencer, and I've noticed that results (specifically bounds) are almost always stated using asymptotic notation. Here are some examples:

On page 10 (Theorem 1.3.2), we bound a function $m(n)$ as follows: $$m(n) < (1 + o(1)) \frac{e \ln 2}{4} n^2 2^n$$

On page 26 (Theorem 2.5.1), we bound a function $R(n)$: $$R(n) \ge \left( \sqrt{\frac{2}{\pi}} + o(1) \right)n^{3/2}$$

To my understanding, an embedded $o(1)$ means (taking the second example) that there exists a function $g(n)$ with $g(n) = o(1)$ (i.e., $\lim_{n \rightarrow \infty} g(n) = 0$) such that $R(n)$ is at least the expression on the right with the $o(1)$ replaced by $g(n)$. My question is: how is a bound like this actually useful? To me, the $o(1)$ seems like a hole that prevents the bound from being used at all. For example, if I want a lower bound for $R(5)$, I'm not sure how I would get one, since I can't just plug in $n = 5$ without knowing $g$.
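To see why the bound is still informative, it may help to pick a concrete stand-in for the $o(1)$ term. The function $g(n) = 1/\ln n$ below is purely hypothetical (the proof in the book does not produce it); the point is that once *some* admissible $g$ is fixed, the bound is an ordinary inequality, and the $g(n)$ correction becomes negligible next to $\sqrt{2/\pi}$ as $n$ grows:

```python
import math

def lower_bound(n, g):
    """Evaluate (sqrt(2/pi) + g(n)) * n^(3/2) for a concrete choice of g."""
    return (math.sqrt(2 / math.pi) + g(n)) * n ** 1.5

# Hypothetical o(1) function standing in for the unknown error term.
g = lambda n: 1 / math.log(n)

for n in (10, 100, 10_000):
    print(n, lower_bound(n, g))

# The ratio of the bound to its "main term" tends to 1 as n grows,
# which is exactly what the (sqrt(2/pi) + o(1)) notation asserts.
ratio = lower_bound(10_000, g) / (math.sqrt(2 / math.pi) * 10_000 ** 1.5)
print(ratio)
```

So for any *fixed* $n$ (like $n = 5$) the notation gives you nothing; what it gives you is the exact leading-order growth rate, with the guarantee that the relative error vanishes in the limit.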

My intuition goes even more haywire when you have something other than 1 in there, like in $$f(n) = n^2 + n + o(\ln n)$$

Or in one example in the book (Theorem 2.2.3), it says there exists a set of size $$m = \Theta\left((\ln n)^{1/(k-1)}\right)$$

I don't see how this could possibly be helpful, since if $m = a \cdot (\ln n)^{1/(k-1)}$ for some constant $a$, that constant $a$ could be arbitrarily large! Is it just that bounds such as these are useful for deriving other important (perhaps more advanced) results? Or can I actually plug in numbers somehow? I'm used to thinking in terms of strict equalities and "plugging things in," so any intuition would be much appreciated!
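For the $\Theta$ case, it may help to unpack the definition: $m = \Theta(f(n))$ means there exist fixed constants $c_1, c_2 > 0$ (not arbitrarily large, just unspecified) with $c_1 f(n) \le m(n) \le c_2 f(n)$ for all large $n$. The sketch below uses a made-up $m(n)$ and a made-up parameter value $k = 3$, purely to illustrate what such witness constants look like:

```python
import math

k = 3  # hypothetical parameter value, for illustration only

def f(n):
    """The comparison function (ln n)^(1/(k-1))."""
    return math.log(n) ** (1 / (k - 1))

def m(n):
    """A made-up function that is Theta(f(n)): 2*f(n) + 5."""
    return 2 * f(n) + 5

# Witness constants for the Theta bound: these particular values work
# for all n large enough that f(n) >= 5/8 (i.e., n >= 2 here).
c1, c2 = 2, 10
for n in (10**3, 10**6, 10**9):
    assert c1 * f(n) <= m(n) <= c2 * f(n)
```

The constants are fixed once and for all by the proof; what $\Theta$ hides is only their exact values, not their existence.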

Best Answer

Think about the harmonic numbers, which can be approximated by $$H_n \approx \ln(n) + \gamma,$$ where $\gamma$ is the Euler–Mascheroni constant. However, $$H_n > \ln(n) + \gamma$$ for all $n$. So if we want a statement that is always true, we might write $$H_n = \ln(n) + \gamma + O\left(\frac{1}{n}\right),$$ which turns our inequality into an equality by adding an error term; in fact the error is positive and at most $\dfrac{1}{2n}$ for every $n$. This tells us more about the function we are analyzing, since $$\lim_{n \rightarrow \infty} \frac{1}{2n} = 0,$$ so we can say $$\lim_{n \rightarrow \infty} \left(H_n - \ln(n) - \gamma\right) = 0.$$
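The explicit error bound above is easy to check numerically: the quantity $H_n - \ln(n) - \gamma$ should sit strictly between $0$ and $\frac{1}{2n}$ for every $n$. A quick sketch:

```python
import math

GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

def harmonic(n):
    """n-th harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# The error term in H_n = ln(n) + gamma + O(1/n) is explicit:
# it is always positive and below 1/(2n).
for n in (1, 10, 100, 1000):
    err = harmonic(n) - (math.log(n) + GAMMA)
    assert 0 < err < 1 / (2 * n)
    print(n, err, 1 / (2 * n))
```

This is the payoff of an asymptotic statement with an explicit error bound: unlike a bare $o(1)$, it lets you plug in concrete values of $n$ and know exactly how far off the approximation can be.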