There is a formal definition of Big O notation on Wikipedia. So far I have come across Big O in numerical analysis, calculus, and algorithms, which are fairly distinct fields. What I am wondering is whether that definition is universal — the only one used in every field where Big O appears — or whether Big O is defined differently in other areas of mathematics.
Big O Definition
The most general definition on Wikipedia — that $$ f\in O(g)\text{ as }x\rightarrow a $$ whenever $$ \limsup_{x\rightarrow a}\left|\frac{f(x)}{g(x)}\right|<\infty $$ — is the standard one used in all fields.
The relevant case in computer science is $a=\infty$. Moreover, by convention, $x$ usually takes integer values, and is hence written $n$. For example, $$ 2n^{2}+3n+1\in O(n^{2})\text{ as }n\rightarrow\infty. $$ Since it is understood that $a=\infty$, one usually does not specify $n\rightarrow\infty$ in the notation.
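A quick numeric sanity check of this example (illustrative only, not a proof — the function name `f` is my own): if $2n^{2}+3n+1\in O(n^{2})$, the ratio $(2n^{2}+3n+1)/n^{2}$ must stay bounded as $n$ grows; here it tends to $2$.

```python
# Sanity check (not a proof): the ratio (2n^2 + 3n + 1) / n^2
# should stay bounded as n -> infinity, which is exactly what
# 2n^2 + 3n + 1 in O(n^2) asserts. In fact it converges to 2.
def f(n):
    return 2 * n**2 + 3 * n + 1

for n in [10, 100, 1000, 10**6]:
    print(n, f(n) / n**2)
```

The constant the ratio approaches (here $2$) is one valid choice of the constant $C$ hidden in the $\limsup$ definition above.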
In numerical analysis, one usually cares about approximating a function near a point $a$ using the information encoded by the smoothness of the function at $a$. For example, $e^{x}$ is approximately $1+x$ as $x\rightarrow0$: $$ e^{x}-(1+x)\in O(x^{2})\text{ as }x\rightarrow0. $$
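The same kind of numeric sanity check works here (again illustrative, with a helper name `remainder` of my own choosing): if $e^{x}-(1+x)\in O(x^{2})$ as $x\rightarrow0$, the ratio $(e^{x}-(1+x))/x^{2}$ must stay bounded as $x\rightarrow0$; by Taylor's theorem it tends to $\tfrac{1}{2}$.

```python
import math

# Sanity check (not a proof): the ratio (e^x - (1 + x)) / x^2
# should stay bounded as x -> 0, which is what
# e^x - (1 + x) in O(x^2) as x -> 0 asserts.
# By Taylor's theorem it actually converges to 1/2.
def remainder(x):
    return math.exp(x) - (1 + x)

for x in [1e-1, 1e-2, 1e-3]:
    print(x, remainder(x) / x**2)
```

Note that pushing $x$ much smaller in floating point eventually runs into catastrophic cancellation in $e^{x}-(1+x)$, so the printed ratios are only trustworthy for moderately small $x$.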