Given that $\lim_{x\to a} f(x) =c$ and $\lim_{x\to a} g(x) = \infty$, using only the precise definitions of limits and infinite limits, prove that $\lim_{x\to a} \frac{f(x)}{g(x)} =0$
For any $\varepsilon \gt 0$, there exists a $\delta_1 \gt 0$ such that $0 \lt \lvert x-a \rvert \lt \delta_1 \Rightarrow \lvert f(x)-c \rvert \lt \varepsilon$.
For any $M \gt 0$, there exists a $\delta_2 \gt 0$ such that $0 \lt \lvert x-a \rvert \lt \delta_2 \Rightarrow g(x) \gt M$.
Given any $\varepsilon \gt 0$, I need to choose a $\delta \gt 0$ (still to be determined) such that $0 \lt \lvert x-a \rvert \lt \delta \Rightarrow \left\lvert \frac{f(x)}{g(x)} \right\rvert \lt \varepsilon$.
Since $\lvert x-a \rvert \lt \delta_1$,
$-\varepsilon \lt f(x)-c \lt \varepsilon\\ -\varepsilon + c \lt f(x) \lt \varepsilon + c\\ \lvert f(x)\rvert \lt \varepsilon + \lvert c\rvert$
I am unsure how to proceed from here. Please advise.
For $\epsilon > 0$, choose $\delta_1 > 0$ such that $0 < \lvert x-a\rvert < \delta_1$ implies $\lvert f(x)-c\rvert < \epsilon$, and choose $\delta_2 > 0$ such that $0 < \lvert x-a\rvert < \delta_2$ implies $g(x) > \frac{\lvert c\rvert+\epsilon}{\epsilon}$; the latter is the infinite-limit definition applied with $M = \frac{\lvert c\rvert+\epsilon}{\epsilon} > 0$. Then, for $\delta = \min(\delta_1, \delta_2)$, $0 < \lvert x-a\rvert < \delta$ implies $g(x) > 0$, so $\lvert g(x)\rvert = g(x)$ and $$\left\lvert \frac{f(x)}{g(x)}\right\rvert = \frac{\lvert f(x)\rvert}{g(x)} < \frac{\epsilon}{\lvert c\rvert+\epsilon}\,\lvert f(x)\rvert \leq \frac{\epsilon}{\lvert c\rvert+\epsilon}\left(\lvert f(x)-c\rvert+\lvert c\rvert\right) < \frac{\epsilon}{\lvert c\rvert+\epsilon}\left(\epsilon + \lvert c\rvert\right) = \epsilon,$$ where $\lvert f(x)\rvert \leq \lvert f(x)-c\rvert+\lvert c\rvert$ follows from the triangle inequality. Therefore, $\lim_{x\to a} \frac{f(x)}{g(x)} = 0$.
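As a quick numerical sanity check of the conclusion (not part of the proof), here is a small Python sketch with illustrative example functions of my own choosing: $a = 1$, $f(x) = 3 + (x-1)$ so that $c = 3$, and $g(x) = 1/(x-1)^2$, which tends to $+\infty$ as $x \to 1$. The quotient should shrink toward $0$ as $x$ approaches $a$.

```python
# Numerical illustration: f(x) -> c and g(x) -> infinity as x -> a,
# so f(x)/g(x) should tend to 0. The functions below are example
# choices for this check only, not dictated by the problem.

a, c = 1.0, 3.0

def f(x):
    return c + (x - a)           # f(x) -> c = 3 as x -> a

def g(x):
    return 1.0 / (x - a) ** 2    # g(x) -> +infinity as x -> a

for h in (1e-1, 1e-2, 1e-3, 1e-4):
    x = a + h
    # Here f(x)/g(x) = (c + h) * h**2, which vanishes as h -> 0.
    print(h, f(x) / g(x))
```

Each halving of the distance to $a$ shrinks the quotient roughly by a factor of four here, consistent with the limit being $0$.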