Given $$\lim_{x\to a}f(x)=\infty$$ $$\lim_{x\to a}g(x)=c, c\in\mathbb R$$
Show that $$\lim_{x\to a}[f(x)+g(x)]=\infty\tag{1}$$
I started off by using the precise definition of both limits: $(\forall M>0)(\exists\delta>0)[0<\vert x-a \vert < \delta \implies f(x)>M]$ $(\forall \epsilon>0)(\exists\delta>0)[0<\vert x-a \vert < \delta \implies \vert g(x)-c \vert < \epsilon]$
What I wanted is: $(\forall N>0)(\exists\delta>0)[0<\vert x-a \vert < \delta \implies f(x)+g(x)>N]$
I started working toward $(1)$: $$\vert [f(x)+g(x)] - (c+M) \vert =\vert f(x)-M + g(x)-c \vert$$ $$\le \vert g(x)-c \vert + \vert f(x)-M \vert$$
I know that $\vert g(x) - c\vert<\epsilon$ and $f(x)-M>0$, but how do I show that $f(x)+g(x)>N$?
Am I doing this right so far? Many thanks for your help in advance!
Let $M>0$ and $\epsilon>0$.
$\exists \delta_1>0$ such that $\forall x:0<|x-a|< \delta_1 \Rightarrow f(x)>M+|c|+ \epsilon$

$\exists \delta_2>0$ such that $\forall x:0<|x-a|< \delta_2 \Rightarrow |g(x)-c|< \epsilon$

Take $\delta=\min\{\delta_1,\delta_2\}$ and we have that $\forall x:0<|x-a|< \delta$

$$f(x)+g(x) = f(x)+(g(x)-c)+c \geq f(x)-|g(x)-c|-|c| > M+|c|+ \epsilon-\epsilon-|c|=M$$
Thus $\lim_{x \rightarrow a}(f(x)+g(x))= +\infty$
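As a sanity check (not a proof), the choice of $\delta$ above can be tested numerically on a concrete hypothetical example, say $f(x)=1/|x-a|$ and $g(x)=c+(x-a)$, which satisfy the hypotheses:

```python
# Numerical sanity check of the epsilon-delta argument, using the
# hypothetical example f(x) = 1/|x - a|, g(x) = c + (x - a),
# for which lim f = +inf and lim g = c as x -> a.
a, c = 2.0, -3.0
f = lambda x: 1.0 / abs(x - a)
g = lambda x: c + (x - a)

M, eps = 100.0, 1.0

# Following the proof: pick delta_1 so that f(x) > M + |c| + eps,
# and delta_2 so that |g(x) - c| < eps; take delta = min of the two.
delta_1 = 1.0 / (M + abs(c) + eps)  # 1/|x-a| > M+|c|+eps  <=>  |x-a| < delta_1
delta_2 = eps                       # here |g(x)-c| = |x-a|, so delta_2 = eps works
delta = min(delta_1, delta_2)

# Sample points with 0 < |x - a| < delta and check f(x) + g(x) > M.
for k in range(1, 1000):
    x = a + delta * k / 1000.0
    assert f(x) + g(x) > M
print("f(x) + g(x) > M on the sampled punctured neighborhood")
```

The names `f`, `g`, `a`, `c`, `M`, `eps` mirror the symbols of the proof; the specific functions are illustrative assumptions only.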