Why is the little $o(h)$ in the Poisson process different from the little-oh used in other fields?


I learned about little-oh notation in my Algorithms class last year:

$$ \mbox{if } \lim_{n\to\infty}\frac{f(n)}{g(n)}=0 \mbox{, then } f(n)\in o\big(g(n)\big) \mbox{ or } f(n)=o\big(g(n)\big) \mbox{.} $$


However, the Poisson process definition reads \begin{align} &(1)~N(0)=0\\ &(2)~\{N(t)\} \mbox{ has independent and stationary increments}\\ &(3)~P\big[\mbox{exactly one event occurs in } (t, t+\Delta{t}]\big] = \lambda\Delta{t} + o(\Delta{t})\\ &(4)~P\big[\mbox{two or more events occur in } (t, t+\Delta{t}]\big] = o(\Delta{t}),\\ \end{align} where \begin{align} &N(t) \mbox{ is the number of events that occur in } (0, t]\\ &o(\Delta{t}) = \left\{ f(\Delta{t}): \lim_{\Delta{t}\to0}\frac{f(\Delta{t})}{\Delta{t}}=0\right\}. \end{align}
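Properties (3) and (4) can be checked numerically. The sketch below (with an arbitrarily chosen rate $\lambda=2$ and interval $\Delta t = 0.01$, purely for illustration) simulates Poisson increments by counting exponential inter-arrival gaps, then compares the empirical probabilities against $\lambda\Delta t$ and against $o(\Delta t)$:

```python
import random

random.seed(0)

lam = 2.0        # rate λ (arbitrary value for illustration)
dt = 0.01        # small interval Δt
trials = 200_000

# For a Poisson process, the number of events in an interval of
# length Δt is Poisson(λΔt); we sample it by counting how many
# Exponential(λ) inter-arrival gaps fit inside Δt.
one = 0
two_plus = 0
for _ in range(trials):
    t, k = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > dt:
            break
        k += 1
    if k == 1:
        one += 1
    elif k >= 2:
        two_plus += 1

p_one = one / trials
p_two_plus = two_plus / trials

# Property (3): P[exactly one event] ≈ λΔt, up to an o(Δt) error.
print(p_one, lam * dt)
# Property (4): P[two or more events] is small even relative to Δt.
print(p_two_plus / dt)
```

Shrinking $\Delta t$ further makes `p_two_plus / dt` shrink as well, which is exactly the $o(\Delta t)$ claim in property (4).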


Why is the little-oh notation used differently?

That is, in the algorithms field little-oh is defined with $\displaystyle\lim_{n\to\infty}$, while in the stochastic processes field it is defined with $\displaystyle\lim_{\Delta t\to0}$.

Thank you for reading my question.

BEST ANSWER

It's mostly a context thing: big- and little-oh notation are best at explaining behaviour at large and small scales. Typically for algorithm and complexity applications, you're looking at the behaviour of something on large sets, and this leads to the $n\to\infty$ definition given earlier in your post.

However, much like defining limits of functions, you can define them as $x\to\infty$ just as well as $x\to a$ for constants $a$. For example, $f\in o(g(x))$ as $x\to a$ says that $\lim_{x\to a}\frac{f(x)}{g(x)}=0$.
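A quick numerical illustration of the $x\to a$ version (here $a=0$): $f(x)=1-\cos(x)$ is $o(x)$ as $x\to0$, because the ratio $f(x)/x$ shrinks toward $0$ as $x$ does.

```python
import math

def ratio(x):
    # f(x) = 1 - cos(x) is o(x) as x -> 0, since f(x)/x -> 0
    # (in fact the ratio behaves like x/2 near 0).
    return (1 - math.cos(x)) / x

for x in (0.1, 0.01, 0.001):
    print(x, ratio(x))  # each ratio is roughly x/2, heading to 0
```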

This can be used in

  • Differentiation: $f(x)=f(a)+f'(a)(x-a)+o(x-a)$ tells us that the best linear approximation to $f(x)$ has slope $f'(a)$

  • Taylor Series: You'll often see things written like: "$\sin(x)=x-\frac{x^3}{6}+O(x^5)$ as $x\to0$"
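That big-oh claim about the Taylor remainder can also be checked numerically: $\sin(x)-(x-\frac{x^3}{6})$ divided by $x^5$ should stay bounded as $x\to0$ (it tends to $\frac{1}{120}$, the next Taylor coefficient).

```python
import math

def remainder_ratio(x):
    # Remainder of the cubic Taylor approximation to sin, scaled
    # by x^5; boundedness of this ratio near 0 is what
    # "sin(x) = x - x^3/6 + O(x^5) as x -> 0" means.
    return (math.sin(x) - (x - x**3 / 6)) / x**5

for x in (0.5, 0.1, 0.05):
    print(x, remainder_ratio(x))  # near 1/120 ≈ 0.00833
```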

On the other hand, you'll see non-local use of the notation elsewhere in stochastic processes. For example, the law of the iterated logarithm tells us that a random walk (under suitable assumptions) satisfies $S_n=O(\sqrt{n\log\log n})$ as $n\to\infty$.
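As a sketch of that non-local use (with a fixed seed so the run is reproducible): for a $\pm1$ random walk, the law of the iterated logarithm says $\limsup |S_n|/\sqrt{2n\log\log n}=1$ almost surely, so the scaled walk stays of order 1 rather than growing with $n$.

```python
import math
import random

random.seed(1)

# Track the largest value of |S_n| / sqrt(2 n log log n) along one
# sample path of a ±1 random walk; the LIL says this stays bounded.
n_max = 100_000
s = 0
worst = 0.0
for n in range(1, n_max + 1):
    s += random.choice((-1, 1))
    if n >= 10:  # log log n needs n > e
        bound = math.sqrt(2 * n * math.log(math.log(n)))
        worst = max(worst, abs(s) / bound)

print(worst)  # of order 1, not growing with n
```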

People use this notation to control error terms, and whether it's a case of $n\to0$, $n\to\infty$, or $n\to a$ for a constant $a$ depends on where you're trying to bound errors.

In the case of the Poisson process, it's about showing continuity (sort of), which is a local property, so it makes sense to consider $\Delta t\to0$, whereas in algorithms it tends to be about long-term feasibility, which corresponds to taking larger and larger $n$.