Calculus: Proving $\lim_a \frac{1}{f}=0$ from $\lim_{a^+} f= -\infty$ and $\lim_{a^-} f= \infty$


Suppose that $\displaystyle\lim_{x\to a^-} f(x)=\infty$ and $\displaystyle\lim_{x\to a^+} f(x)=-\infty$. Using only the definitions of limit and infinite limit, prove that $$ \lim_{x\to a} \frac{1}{f(x)}=0. $$

I am having difficulties proving this statement using $\varepsilon$-$\delta$ definitions. Any help and guiding steps would be appreciated. Thank you!


2 Answers

Best answer

Fix $\varepsilon > 0$.

By definition and the assumptions, you know that there exist (choosing $A\stackrel{\rm def}{=} \frac{1}{\varepsilon}>0$) $\delta_+,\delta_->0$ such that, respectively, $$ f(x) > A \qquad \forall x \in (a-\delta_-,a)\tag{1} $$ $$ f(x) < -A \qquad \forall x \in (a,a+\delta_+) \tag{2} $$ (can you see why? The definitions hold for every $A>0$, so in particular for this convenient choice.) Since $A>0$, this in particular implies that $$ 0 < \frac{1}{f(x)} < \frac{1}{A} \qquad \forall x \in (a-\delta_-,a) \tag{3} $$ $$ 0 > \frac{1}{f(x)} > -\frac{1}{A} \qquad \forall x \in (a,a+\delta_+) \tag{4} $$ i.e., putting (3) and (4) together, $$ \left\lvert \frac{1}{f(x)}\right\rvert < \frac{1}{A} \qquad \forall x \in (a-\delta_-,a)\cup(a,a+\delta_+). \tag{5} $$ Now, remember our choice of $A$, and consider $\delta\stackrel{\rm def}{=}\min(\delta_-,\delta_+)$.
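For completeness, here is one way to spell out the final step left as a hint above:

$$
0 < |x-a| < \delta
\;\Longrightarrow\;
\left\lvert \frac{1}{f(x)} - 0 \right\rvert < \frac{1}{A} = \varepsilon,
$$

since any such $x$ lies in $(a-\delta_-,a)\cup(a,a+\delta_+)$, so (5) applies; this is exactly the $\varepsilon$-$\delta$ statement that $\displaystyle\lim_{x\to a} \frac{1}{f(x)} = 0$.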

A second answer:

Let $\varepsilon>0$; by assumption, there are $\delta_1>0$ and $\delta_2>0$ such that,

  1. for $a-\delta_1<x<a$, $f(x)>1/\varepsilon$,

  2. for $a<x<a+\delta_2$, $f(x)<-1/\varepsilon$.

Set $\delta=\min(\delta_1,\delta_2)$ and suppose $0<|x-a|<\delta$. Try proving that $$ \left|\frac{1}{f(x)}-0\right|<\varepsilon $$ which will end the proof.
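Not a substitute for the proof, but a quick numerical sanity check can build intuition. Take the hypothetical example $f(x) = \frac{1}{1-x}$ with $a = 1$: it satisfies the hypotheses ($f\to+\infty$ from the left, $f\to-\infty$ from the right), and $\frac{1}{f(x)} = 1-x \to 0$ from both sides:

```python
# Sanity check (not a proof): f(x) = 1/(1 - x) at a = 1 blows up to
# +inf as x -> 1- and to -inf as x -> 1+, while 1/f(x) = 1 - x -> 0.

def f(x):
    return 1.0 / (1.0 - x)

a = 1.0
for h in [1e-1, 1e-3, 1e-6]:
    left, right = a - h, a + h  # approach a from both sides
    print(f"h={h:.0e}  f(a-h)={f(left):.3e}  f(a+h)={f(right):.3e}  "
          f"1/f(a-h)={1/f(left):.3e}  1/f(a+h)={1/f(right):.3e}")
```

As $h$ shrinks, $f$ diverges with opposite signs on the two sides while $1/f$ shrinks to $0$ in absolute value, matching the claimed limit.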