Suppose $\lim_{x \to a} f(x)=\infty$ and $\lim_{x \to a} g(x)=c$ where $c \in \Bbb R$. Prove that $\lim_{x \to a} (f(x) + g(x)) = \infty$


Suppose $\lim_{x \to a} f(x)=\infty$ and $\lim_{x \to a} g(x)=c$ where $c \in \Bbb R$. Prove that $\lim_{x \to a} (f(x) + g(x)) = \infty$

I'm trying to prove it, and I roughly know how to do it, but I'm having trouble writing it formally and deriving the conclusion properly.

My proof:

From the given definitions, we know that $$(\forall M_1 > 0)(\exists \delta_1 > 0 : [0<|x-a|<\delta_1] \rightarrow [f(x)> M_1])$$ and $$(\forall M_2 > 0)(\exists \delta_2 > 0 : [0<|x-a| \leq \delta_2] \rightarrow [|g(x)-c|< M_2]),$$ and we want to prove that $$(\forall M_3 > 0)(\exists \delta_3 > 0 : [0<|x-a|<\delta_3] \rightarrow [f(x)+g(x)> M_3])$$


After that I thought I could say that for every $M$ there is some $\delta$ satisfying both conditions, but I have no idea how to proceed from there, and I don't have another idea.

So how can I proceed from there? Any help would be appreciated.


There are 2 best solutions below


Note that your limit definition for $g(x)$ is incorrect.

A possible approach. For every $M_3 > 0$ there exists a $\delta_1$ such that $0<|x-a|<\delta_1$ implies $f(x)>M_3 + 1/M_3 - c $. Similarly, there exists a $\delta_2$ such that $0<|x-a|<\delta_2$ implies $|g(x)-c|< 1/M_3$. Therefore, by choosing $\delta = \min\{\delta_1,\delta_2\}$ we get $$ f(x) + g(x) > M_3 + \frac{1}{M_3} - c + g(x) > M_3 + \frac{1}{M_3} - c + c- \frac{1}{M_3} = M_3, $$ for $0<|x-a|<\delta$.
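The same idea can be written out even more simply by using the fixed tolerance $1$ in place of $1/M_3$ (a sketch; the particular tolerance is a free choice, not something the answer requires):

```latex
% Sketch: given M > 0, pick \delta_1 so that
%   0 < |x-a| < \delta_1  implies  f(x) > M - c + 1,
% and \delta_2 so that
%   0 < |x-a| < \delta_2  implies  |g(x) - c| < 1,  i.e.  g(x) > c - 1.
% With \delta = \min\{\delta_1, \delta_2\}, for 0 < |x-a| < \delta:
\begin{align*}
  f(x) + g(x) &> (M - c + 1) + g(x) \\
              &> (M - c + 1) + (c - 1) \\
              &= M.
\end{align*}
```

Since $M > 0$ was arbitrary, this is exactly the statement $\lim_{x \to a}(f(x)+g(x)) = \infty$.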


Since $$ |f(x)+g(x)| = |f(x)| \left| 1 + \frac{g(x)}{f(x)} \right|, $$ by assumption the fraction $g(x)/f(x)$ converges to zero as $x \to a$ (note that $f(x)>0$, and in particular $f(x) \neq 0$, sufficiently close to $a$). Hence $$ \left| 1 + \frac{g(x)}{f(x)} \right| \geq \frac{1}{2} $$ for every $x$ with $0<|x-a|<\delta$ (say). It is now easy to conclude that $|f(x)+g(x)|$ becomes as large as we wish, provided that $x$ is close to $a$. Try now to use some $M>0$ and $\delta$ to make this more formal.
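One way the formalization might go (a sketch filling in the suggested $M$ and $\delta$; the specific deltas are choices, not forced by the argument):

```latex
% Sketch. Since f(x) \to \infty, there is \delta_0 > 0 with f(x) > 0
% for 0 < |x-a| < \delta_0; since g(x)/f(x) \to 0, there is \delta_1 > 0
% with |g(x)/f(x)| < 1/2 there, hence 1 + g(x)/f(x) > 1/2. So for
% 0 < |x-a| < \min\{\delta_0, \delta_1\}:
\begin{align*}
  f(x) + g(x) = f(x)\left(1 + \frac{g(x)}{f(x)}\right) > \tfrac{1}{2}\, f(x).
\end{align*}
% Given M > 0, choose \delta_2 with f(x) > 2M for 0 < |x-a| < \delta_2.
% Then f(x) + g(x) > M whenever 0 < |x-a| < \min\{\delta_0, \delta_1, \delta_2\}.
```

Note that because $f(x) > 0$ near $a$, the absolute values in the original display can be dropped, giving the one-sided bound the limit statement actually needs.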