Apologies if this question is too trivial; in my Asymptotics and Perturbation Methods lectures, my professor has been using the notations $o(x)$, $O(x)$, and I am really struggling to understand their definitions.
I have the following questions:
Determine the order of the following expressions as $\epsilon \rightarrow0$:
i) $\sqrt{\epsilon(1-\epsilon)} $
ii) $4\pi^2\epsilon$
iii) $\int^{\epsilon}_{0} \exp(-s^2)\, ds$
To me this looks like everything should have order $0$ as $\epsilon \rightarrow 0$, but of course this seems silly... in which case any explanation of the use of these symbols, or perhaps some examples, would be greatly appreciated.
EDIT:
Definition 1: If for any $\epsilon > 0$ there exists $\delta>0$ such that $$\lvert f(x)\rvert < \epsilon\lvert g(x) \rvert$$ when $0<\lvert x-x_0\rvert < \delta$, then $f(x) = o(g(x))$ as $x \to x_0$, meaning $f$ is asymptotically smaller than $g$.
Use: $f(x) = o(g(x)) \implies \frac{f(x)}{g(x)} \rightarrow 0$ as $x \rightarrow x_0$.
Definition 2: If there exist positive constants $K$ and $\delta$ such that $$ \lvert f(x) \rvert \leq K\lvert g(x) \rvert$$ when $0<\lvert x-x_0\rvert < \delta$, then $$f(x) = O(g(x)).$$
Attempt at i):
I need to show that $\lvert\sqrt{\epsilon(1-\epsilon)}\rvert \leq K \lvert {\epsilon}\rvert.$
Thus, squaring both sides and dividing through by $\epsilon$ (valid for $0 < \epsilon < 1$):
$$\epsilon(1-\epsilon) \leq K^2\epsilon^2 \implies (1-\epsilon) \leq K^2\epsilon \implies \frac{1-\epsilon}{\epsilon} \leq K^2.$$
I'm not 100% sure what to conclude from this one; my gut instinct says we can then say it is $O(\epsilon)$.
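To sanity-check this, I tried a quick numerical experiment of my own (not from the lectures): if $\sqrt{\epsilon(1-\epsilon)}$ really were $O(\epsilon)$, the ratio $\sqrt{\epsilon(1-\epsilon)}/\epsilon$ would stay bounded as $\epsilon \rightarrow 0$.

```python
import math

# Ratio of sqrt(eps*(1-eps)) to eps for shrinking eps.
# If the expression were O(eps), this ratio would stay below some fixed K.
for eps in [1e-1, 1e-2, 1e-4, 1e-6]:
    ratio = math.sqrt(eps * (1 - eps)) / eps
    print(f"eps = {eps:.0e}, ratio = {ratio:.4g}")
```

The ratio keeps growing (it behaves like $\epsilon^{-1/2}$), which makes me doubt that a single constant $K$ can work here.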
Attempt at ii):
Given $4\pi^2\epsilon$, I believe this is $O(\epsilon)$, since the positive constant would be $K = 4\pi^2$.
Limited attempt at iii):
For this expression I figured I should first compute the integral, after which I find it equal to:
$$ \frac{\sqrt{\pi}}{2} \operatorname{erf}(\epsilon).$$ I'm not entirely sure what to do with this now; my brief understanding of the error function is that it is defined by an infinite sum, so neither definition could hold? (I feel that this is incorrect, however.)
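To get a feel for the small-$\epsilon$ behaviour, I also ran my own numerical check (using Python's standard-library `math.erf`), comparing the integral to $\epsilon$ itself:

```python
import math

# I(eps) = (sqrt(pi)/2) * erf(eps) = integral of exp(-s^2) from 0 to eps.
# If the integral goes to 0 at the same speed as eps, the ratio I(eps)/eps
# should approach a nonzero constant as eps -> 0.
for eps in [1e-1, 1e-3, 1e-6]:
    integral = math.sqrt(math.pi) / 2 * math.erf(eps)
    print(f"eps = {eps:.0e}, I(eps)/eps = {integral / eps:.6f}")
```

The ratio appears to approach $1$, which makes sense since $e^{-s^2}\approx 1$ near $s=0$, so perhaps the infinite sum is not actually an obstacle.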
They all approach $0$ as $\epsilon\to 0$, but the question is: how fast do they approach $0$? For example, $\epsilon^2$ approaches $0$ faster than $\epsilon$ does, because the ratio $\frac{\epsilon^2}{\epsilon}$ also goes to $0$, whereas $2\epsilon$ goes to zero at essentially the same speed as $\epsilon$, because the ratio doesn't approach $0$ or $\infty$.
We say $\epsilon^2=o(\epsilon)$ to mean that the ratio goes to $0$, and consequently that $\epsilon^2$ goes to $0$ faster than $\epsilon$. The notation $2\epsilon=O(\epsilon)$ means that $2\epsilon$ goes to zero at the same speed or faster: the ratio is bounded.
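A quick numerical sketch (my own illustration, not part of the formal definitions) makes the distinction concrete:

```python
# Compare eps^2 / eps (goes to 0, so eps^2 = o(eps)) with
# 2*eps / eps (stays at the constant 2, so 2*eps = O(eps), same speed).
for eps in [1e-1, 1e-3, 1e-5]:
    print(f"eps = {eps:.0e}: eps^2/eps = {eps**2 / eps:.1e}, 2*eps/eps = {2 * eps / eps:.1f}")
```

The first ratio shrinks with $\epsilon$, while the second is pinned at $2$ for every $\epsilon$.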
So here the first question is to compare the functions given to $\epsilon$, and say whether the ratio goes to $0$ ($f(\epsilon)$ goes to $0$ faster), goes to $\infty$ ($f(\epsilon)$ goes to $0$ slower), or is bounded away from $0$ and $\infty$ ($f(\epsilon)$ goes to $0$ at the same speed). For functions which aren't the same speed as $\epsilon$, you might investigate whether there is some exponent $a$ for which they have the same speed as $\epsilon^a$.
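This recipe can be tested numerically on the three expressions from the question (a sketch of mine: the exponents $a$ below are the ones the ratios suggest, and you should still confirm them analytically):

```python
import math

def ratio(f, a, eps):
    """Ratio f(eps) / eps**a; a nonzero finite limit as eps -> 0
    means f goes to 0 at the same speed as eps**a."""
    return f(eps) / eps**a

exprs = [
    ("sqrt(eps*(1-eps))",      lambda e: math.sqrt(e * (1 - e)),          0.5),
    ("4*pi^2*eps",             lambda e: 4 * math.pi**2 * e,              1.0),
    ("int_0^eps exp(-s^2) ds", lambda e: math.sqrt(math.pi) / 2 * math.erf(e), 1.0),
]

for name, f, a in exprs:
    for eps in [1e-2, 1e-4, 1e-6]:
        print(f"{name}: eps = {eps:.0e}, f(eps)/eps^{a} = {ratio(f, a, eps):.6f}")
```

Each ratio settles to a nonzero constant ($1$, $4\pi^2 \approx 39.48$, and $1$ respectively), consistent with the three expressions being the same speed as $\epsilon^{1/2}$, $\epsilon$, and $\epsilon$ as $\epsilon \to 0$.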