For some time I have been trying to figure out big $O$ and small $o$ notation. I understand the main idea behind these notations, but when I try to dig a bit deeper there is some stuff I can't figure out. In some of my textbooks, the limit definition of big $O$ is: $$\lim_{x \to a}\frac{f(x)}{g(x)}=b$$ where $a\in \mathbb{R}\cup\{-\infty,+\infty\}$.
I also know there is this other definition of big $O$: $|f(x)|\leq C|g(x)|$, where $C$ is some positive constant. So here are my questions:
- In some textbooks it says that the value $b$ in the limit definition should satisfy $0<b<\infty$. So can the value of the limit be any real number, or does it have to be strictly positive (I know it can also be zero, though they often don't specify it), and why is that?
- I guess this question relates to the first one. In the other definition of big $O$, why does $C$ have to be a positive constant, and why do $f(x)$ and $g(x)$ have to be positive? Why must they be written in terms of absolute values?
- When I think about big $O$, I often think about it in terms of graphs. Let's say we have some function $h(x)$ whose value approaches, say, $-2$ as $x$ goes to infinity. Why can't we say, for example, that $h(x) = O(-1)$? I understand that this holds by the above definition, but I don't understand the bigger picture of why it is like that, which leads me to the third question.
- When observing big $O$ on a graph, do we only look at how the functions bound each other in the upper part of the graph, where the $y$ value is positive? If so, why is that?
- I guess the "limit definition" of big $O$ is not a real definition, but is rather meant to show that there must be some constant ratio between the two functions. Am I right about that? If so, is there some example showing why the "limit definition" of big $O$ is not equivalent to $|f(x)|\leq C |g(x)|$?
In a nutshell, I see that all my questions are related and similar, but I have tried to categorize them a bit. I hope I was clear about my confusions.
The "limit definition" is just wrong. Take $f(x) = \sin x$ and $g(x) = \sin x + 10$; then $f = O(g)$ as $x \to \infty$, but $\lim\limits_{x \to \infty} \frac{f(x)}{g(x)}$ doesn't exist. And there need not be any "constant ratio": $f$ can sometimes be significantly smaller than $g$.
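This counterexample is easy to check numerically. A small sketch (the sample points and the witness constant $C = 1/9$ are my own choices): the bound $|f(x)| \le C|g(x)|$ holds everywhere because $|\sin x| \le 1$ and $\sin x + 10 \ge 9$, yet the ratio $f(x)/g(x)$ keeps oscillating instead of converging.

```python
import math

# Counterexample from the answer: f(x) = sin(x), g(x) = sin(x) + 10.
def f(x):
    return math.sin(x)

def g(x):
    return math.sin(x) + 10

# f = O(g), witnessed by C = 1/9:
# |sin x| <= 1 and sin x + 10 >= 9 for all real x.
C = 1 / 9
xs = [k * 0.1 for k in range(1, 10001)]  # sample points up to x = 1000
assert all(abs(f(x)) <= C * abs(g(x)) for x in xs)

# But f(x)/g(x) has no limit: the ratio oscillates
# between roughly -1/9 and 1/11 forever.
ratios = [f(x) / g(x) for x in xs]
print(min(ratios), max(ratios))
```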
The correct definition is as follows: $f(x) = O(g(x))$ as $x \to a$ if there exists a constant $C$ such that for all $x$ close to $a$, $|f(x)| < C \cdot |g(x)|$. If we say "there exists a constant $C > 0$", we get an equivalent definition, so some sources include this requirement while others omit it.
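To make the definition concrete, here is a small numerical sketch; the functions $f(x) = 3x^2 + 5x$ and $g(x) = x^2$ are my own illustrative choices, not from the question. For $x \ge 5$ we have $5x \le x^2$, so $3x^2 + 5x \le 4x^2$, and $C = 4$ witnesses $f = O(g)$ as $x \to \infty$.

```python
# Illustrative check of the definition |f(x)| <= C*|g(x)| for x near a = infinity.
def f(x):
    return 3 * x**2 + 5 * x

def g(x):
    return x**2

# Claim: f = O(g) as x -> infinity, witnessed by C = 4,
# since for x >= 5 we have 5x <= x^2, hence 3x^2 + 5x <= 4x^2.
C = 4
assert all(abs(f(x)) <= C * abs(g(x)) for x in range(5, 100000))
```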
If a source (usually one concerned with algorithms) is interested only in non-negative functions, it may introduce this definition only for functions $f$ and $g$ that take non-negative values, and drop the absolute values in the definition. For such functions this is equivalent to the original definition.
When we look at a graph, we are interested in both positive and negative values: $f = O(g)$ is equivalent to $|f| = O(g)$ and to $f = O(|g|)$.
And yes, if we have, say, $h(x) = 2 + x$ and $g(x) = -2$, then we can certainly say that $h(x) = O(g(x))$ as $x \to 0$. This is not very common, because when we speak about asymptotics ($h = O(g)$), it is usually because we want to replace $h$ with something simpler, and in that case it makes sense to choose a $g$ that is easier to work with.
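This last example can also be verified directly from the definition. A minimal sketch (the witness constant $C = 1.5$ and the sampling grid near $0$ are my own choices): for $|x| < 1$ we have $|h(x)| = |2 + x| < 3 = 1.5 \cdot |-2|$, so the negative constant function $g(x) = -2$ really does bound $h$ near $0$.

```python
# The h(x) = O(g(x)) example with a negative constant g, as x -> 0.
def h(x):
    return 2 + x

def g(x):
    return -2

# For |x| < 1: |h(x)| = |2 + x| < 3 = 1.5 * |-2| = C * |g(x)|.
C = 1.5
xs = [k / 1000 for k in range(-999, 1000)]  # sample points in (-1, 1)
assert all(abs(h(x)) <= C * abs(g(x)) for x in xs)
```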