Suppose that $\displaystyle\lim_{x\to a^-} f(x)=\infty$ and $\displaystyle\lim_{x\to a^+} f(x)=-\infty$. Using only the definitions of limit and infinite limit, prove that $$ \lim_{x\to a} \frac{1}{f(x)}=0. $$
I am having difficulty proving this statement using the $\varepsilon$-$\delta$ definitions. Any help and guiding steps would be appreciated. Thank you!
Fix $\varepsilon > 0$.
By the assumptions and the definition of an infinite one-sided limit, for **every** $A>0$ there exist $\delta_-,\delta_+>0$ such that $$ f(x) > A \qquad \forall x \in (a-\delta_-,a)\tag{1} $$ $$ f(x) < -A \qquad \forall x \in (a,a+\delta_+) \tag{2} $$ (can you see why?). Since this holds for all $A>0$, it holds in particular for the choice convenient to us, $A\stackrel{\rm def}{=} \frac{1}{\varepsilon}>0$.

Taking reciprocals, (1) and (2) imply $$ 0 < \frac{1}{f(x)} < \frac{1}{A} \qquad \forall x \in (a-\delta_-,a) \tag{3} $$ $$ -\frac{1}{A} < \frac{1}{f(x)} < 0 \qquad \forall x \in (a,a+\delta_+) \tag{4} $$ i.e., putting (3) and (4) together, $$ \left\lvert \frac{1}{f(x)}\right\rvert < \frac{1}{A} \qquad \forall x \in (a-\delta_-,a)\cup(a,a+\delta_+) \tag{5} $$ (note that the point $x=a$ itself is excluded, as it must be for a limit). Now, remember our choice of $A$, and consider $\delta\stackrel{\rm def}{=}\min(\delta_-,\delta_+)$.
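In case the final step is still unclear, here is one way it can be written out in full; this only unwinds the hint above (the choice of $A$ and of $\delta$), it introduces nothing new:

```latex
Let $\delta \stackrel{\rm def}{=} \min(\delta_-,\delta_+) > 0$.
If $0 < |x-a| < \delta$, then
$x \in (a-\delta_-,a) \cup (a,a+\delta_+)$,
so by $(5)$ and the choice $A = \tfrac{1}{\varepsilon}$,
\[
  \left\lvert \frac{1}{f(x)} - 0 \right\rvert
  < \frac{1}{A}
  = \varepsilon .
\]
Since $\varepsilon > 0$ was arbitrary, this is exactly the
$\varepsilon$-$\delta$ definition of
$\displaystyle\lim_{x\to a} \frac{1}{f(x)} = 0$.
```

Note that it is essential that the punctured neighborhood $(a-\delta_-,a)\cup(a,a+\delta_+)$ excludes $a$ itself: $f(a)$ need not even be defined.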