I have a problem with an exercise from M. Holmes, "Introduction to Perturbation Methods".
I'll write out the definitions first.
Def: Let $f=f(\epsilon)$ and $g=g(\epsilon)$. We say $f$ is "big Oh" of $g$ and write
\begin{equation} f=O(g)\quad \text{as} \quad \epsilon \downarrow \epsilon_0 \end{equation}
if there are constants $k_0$ and $\epsilon_1$ (independent of $\epsilon$) such that
\begin{equation} |f|\leq k_0|g| \quad \text{for} \quad \epsilon_0<\epsilon<\epsilon_1. \end{equation}
We say $f$ is "little oh" of $g$ and write
\begin{equation} f=o(g)\quad \text{as} \quad \epsilon \downarrow \epsilon_0 \end{equation}
if for every $\delta>0$ there exists an $\epsilon_2$ (independent of $\epsilon$, though possibly depending on $\delta$) such that
\begin{equation} |f|\leq \delta|g| \quad \text{for} \quad \epsilon_0<\epsilon<\epsilon_2. \end{equation}
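For a concrete instance of the second definition (my own example, not from the book): $\epsilon^2=o(\epsilon)$ as $\epsilon\downarrow 0$, because given any $\delta>0$ we can take $\epsilon_2=\delta$, so that
\begin{equation} |\epsilon^2| = \epsilon\,|\epsilon| \leq \delta\,|\epsilon| \quad \text{for} \quad 0<\epsilon<\epsilon_2. \end{equation}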
Now the exercise is the following:
1.2. In this problem it is assumed that $\epsilon \downarrow 0$ (i.e., $\epsilon_0=0$).
- Show that if $f=O(\epsilon^\alpha)$ then $f=o(\epsilon^\beta)$ for any $\beta < \alpha$.
- (The remaining parts are not important here.)
There is a theorem that states the following:
Theorem 1.3:
- If \begin{equation} \lim_{\epsilon\to \epsilon_0}\frac{f}{g}=L, \quad -\infty<L<\infty, \end{equation}
then $f=O(g)$.
- If \begin{equation} \lim_{\epsilon\to \epsilon_0}\frac{f}{g}=0, \end{equation}
then $f=o(g)$.

But I don't think the theorem applies here, since $f=O(\epsilon^\alpha)$ does not guarantee that the limit of $f/\epsilon^\beta$ exists. Thanks in advance.
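To show where I am stuck, here is the estimate I would expect the definitions to give (my attempt, not the book's). From $|f|\leq k_0\epsilon^\alpha$ for $0<\epsilon<\epsilon_1$, and $\beta<\alpha$,
\begin{equation} |f|\leq k_0\,\epsilon^{\alpha-\beta}\,\epsilon^\beta \quad \text{for} \quad 0<\epsilon<\epsilon_1, \end{equation}
where $\alpha-\beta>0$, so the factor $k_0\,\epsilon^{\alpha-\beta}$ should be made smaller than any given $\delta>0$ for $\epsilon$ small enough, say (assuming $k_0>0$) by taking $\epsilon_2=\min\bigl(\epsilon_1,(\delta/k_0)^{1/(\alpha-\beta)}\bigr)$. Is this the intended argument, or is Theorem 1.3 supposed to be used?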