Problem understanding the concept of Big O notation.


Let $X$ be a normed space with norm $\| \cdot\|$, and suppose we are given a family $x^{\delta}$ for $\delta > 0$. We say $x^{\delta} \to x \ \text{as} \ \delta \to 0$ with rate $a > 0$ if $$\|x^{\delta} - x\| = O(\delta^a),$$ where $O(\cdot)$ denotes Big O notation: https://en.wikipedia.org/wiki/Big_O_notation

Now my question is: which rate is better? That is, if $$\|x^{\delta} - x\| = O(\delta) \qquad \text{or} \qquad \|x^{\delta} - x\| = O(\delta^2),$$ which one is the better rate of convergence?

Best answer:

First: $x^\delta$ need not tend towards $x$ as $\delta\rightarrow 0$ in general. But setting that aside, Big O notation is defined as:

$$f(x)\in O(g(x)) \quad (x\rightarrow x_0) \quad :\Leftrightarrow \quad \limsup_{x\rightarrow x_0}\left|\frac{f(x)}{g(x)}\right| = c < \infty$$
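As a concrete check of this definition, one can watch the ratio $|f(x)/g(x)|$ numerically. A minimal sketch, using the made-up example $f(x)=5x^2+x$ (not from the question):

```python
# Assumed example: f(x) = 5x^2 + x is O(x^2) as x -> infinity,
# since |f(x)/x^2| stays bounded (it settles near the constant c = 5).
def f(x):
    return 5 * x**2 + x

ratios = [abs(f(x) / x**2) for x in (1e3, 1e6, 1e9)]
print(ratios)  # the ratios approach 5 as x grows
```

The same check with $g(x) = x$ would give an unbounded ratio, so $f(x) \notin O(x)$ as $x \to \infty$.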

If $x_0$ is not specified, it is usually taken to be infinity, but that convention may depend on the setting you are in.

Now your question was: which is better? That depends greatly on $x_0$. If $x_0=\infty$ and $m\ge n$, then: $$f(x)\in O(x^n) \Rightarrow c=\limsup_{x\rightarrow \infty}\left|\frac{f(x)}{x^n}\right|\ge\limsup_{x\rightarrow \infty}\left|\frac{f(x)}{x^m}\right|\Rightarrow f(x)\in O(x^m)$$ So you could say that $O(x^n) \subset O(x^m)$ for $x_0=\infty$ and $m>n$, which means it is more informative to say $f(x) \in O(x^n)$, because it implies the other statement. If $x_0=0$, it is the other way around, since the inequality above relied on $x^n\le x^m$, which holds for $x\ge 1$ but reverses for $0<x<1$. In your case $\delta \to 0$, so $O(\delta^2) \subset O(\delta)$, and $O(\delta^2)$ is the better (faster) rate of convergence.

Generally, you want to use the smallest set $O(g)$ that still contains $f$.

Remark: While the notation $f(x)=O(g(x))$ is more common, $f(x)\in O(g(x))$ gives the better intuition here, because we are really talking about sets of functions.