What's the difference between these two definitions of little-o (from the CLRS vs. Wiley algorithms books)?


Algorithm Design: Foundations, Analysis, and Internet Examples by Goodrich and Tamassia (Wiley, 2002) defines little-o as follows:

$o(g(n)) = \{ f(n) : \forall c > 0,\ \exists n_0 > 0\ \text{such that } \forall n \geq n_0,\ 0 \leq f(n) \leq cg(n) \}$.

This definition differs from that of Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson, Ronald Rivest, and Clifford Stein (MIT Press, 2009) only in the strictness of the last inequality; CLRS defines it as:

$o(g(n)) = \{ f(n) : \forall c > 0,\ \exists n_0 > 0\ \text{such that } \forall n \geq n_0,\ 0 \leq f(n) < cg(n) \}$.

The latter definition is the one I see in most other sources, too. Are the two equivalent because $c$ ranges over all positive constants? If not, which is standard? (I assume the CLRS one.)

1 Answer

Best answer:

Neither one is standard; the usual definition uses absolute values and does not require $f$ to be nonnegative: $$o(g)=\{f: \forall c>0\,\exists n_0\,\forall n>n_0\,(|f(n)|\le c|g(n)|)\}.$$ The two textbook definitions do, however, describe the same set whenever $g$ is eventually positive: since $c$ ranges over all positive constants, the strictness of the last inequality is absorbed by the quantifier. Applying the non-strict bound with $c/2$ in place of $c$ gives $f(n)\le \tfrac{c}{2}\,g(n)<c\,g(n)$.
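A numerical illustration (not a proof) of why the $\le$ and $<$ versions coincide when $g$ is eventually positive: a witness $n_0$ for the non-strict definition at $c/2$ also works for the strict definition at $c$. The functions $f(n)=2n$ and $g(n)=n^2$ below are illustrative choices, not from the post.

```python
def f(n):
    # f(n) = 2n, which is o(n^2)
    return 2 * n

def g(n):
    return n * n

def witness_n0(c, n_max=10_000):
    """Smallest tested n0 such that f(n) <= c*g(n) for all n in [n0, n_max]."""
    n0 = n_max
    for n in range(n_max, 0, -1):
        if f(n) <= c * g(n):
            n0 = n
        else:
            break
    return n0

for c in (1.0, 0.1, 0.01):
    n0 = witness_n0(c / 2)  # witness for the non-strict (Goodrich-Tamassia) bound at c/2
    for n in range(n0, n0 + 100):
        assert f(n) <= (c / 2) * g(n)  # non-strict bound at c/2 ...
        assert f(n) < c * g(n)         # ... implies the strict (CLRS) bound at c, as g(n) > 0
```

The same $c/2$ substitution works symbolically for any eventually positive $g$; the script merely spot-checks it on a concrete pair of functions.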