For context: I understand the basic $\epsilon$-$\delta$ definition of a limit, but have not learned anything pertaining to limits that do not exist, other than the fact that the limit does not exist.
I am working on the following:
Suppose that $\lim_{x\to 0}f(x)$ exists and is non-zero. Prove that if $\lim_{x\to 0}g(x)$ does not exist, then $\lim_{x\to 0}f(x)g(x)$ does not exist.
Source: Calculus by Spivak
My ideas:
- Using contradiction is one idea that I had. Something along the lines of: Suppose that $\lim_{x\to 0}f(x)g(x)=l$, and let $\epsilon>0$ be arbitrary. Because $\lim_{x\to 0}f(x)=c\neq 0$, there is a $\delta_1$ such that $0<|x|<\delta_1$ implies $|f(x)-c|<\epsilon$. We also have a $\delta_2$ such that $0<|x|<\delta_2$ implies $|f(x)g(x)-l|<\epsilon$. But I'm not sure how to manipulate the expressions past this point.
- I also thought of using the fact that $\lim_{x\to a}f(x)g(x)=\lim_{x\to a}f(x)\cdot\lim_{x\to a}g(x)$ to obtain $\lim_{x\to 0}f(x)g(x)=c\cdot\lim_{x\to 0}g(x)$, but I don't know how to show that $c\times\text{non-existent limit}=\text{non-existent limit}$.
The definition of limit is the following:
Let $a\in X\subset\mathbb{R}$ and $f:X\rightarrow \mathbb{R}$. We say the limit of $f$ as $x$ approaches $a$ is $L$ (and we write $\lim_{x\rightarrow a}f(x)=L$) if, for every $\epsilon>0$, there exists $\delta=\delta(\epsilon)>0$ such that, for $x\in X$,
$$0<|x-a|<\delta\Rightarrow |f(x)-L|<\epsilon. $$
Negating this: $\lim_{x\rightarrow a}f(x)$ does not exist if and only if, for every $A\in\mathbb{R}$, there exists $\epsilon>0$ such that, for every $\delta>0$, there is some $x\in X$ with
$$0<|x-a|<\delta\quad\text{and}\quad |f(x)-A|\geq \epsilon. $$
(Note the existential on $x$: we only need *some* point within each $\delta$-neighborhood to stay $\epsilon$-far from $A$, not all of them.)
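To see the negation in action, here is a standard example (illustrative, not part of the original problem): the sign function has no limit at $0$.

```latex
% Illustrative example: g(x) = x/|x| for x \neq 0 has no limit at 0.
% For any candidate A, take \epsilon = 1/2. Every punctured interval
% 0 < |x| < \delta contains points where g(x) = 1 and points where
% g(x) = -1, and |1 - A| < 1/2 and |-1 - A| < 1/2 cannot both hold,
% since the triangle inequality would then give 2 = |1-(-1)| < 1.
g(x) = \frac{x}{|x|}, \qquad
\forall A\in\mathbb{R}\ \exists\epsilon=\tfrac12 :\
\forall\delta>0\ \exists x,\ 0<|x|<\delta
\ \text{ and }\ |g(x)-A|\geq \tfrac12 .
```

So for each $\delta$, the witness $x$ is chosen on whichever side of $0$ puts $g(x)$ far from $A$.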
Now to your question, taking the domain of $f$ and $g$ to be $\mathbb{R}$:
Suppose that $\lim_{x\rightarrow 0}f(x)=L\neq 0.$ Then, given $\epsilon >0$, there exists $\delta>0$ such that, for $x\in\mathbb{R}$, $$0<|x|<\delta\Rightarrow |f(x)-L|<\epsilon. $$
Suppose that $\lim_{x\rightarrow 0}g(x)$ does not exist. Then, for every $A\in\mathbb{R},$ there exists $\epsilon>0$ such that, for every $\delta>0$, there is some $x\in\mathbb{R}$ with
$$0<|x|<\delta\quad\text{and}\quad |g(x)-A|\geq \epsilon. $$
So, to prove that the limit of $f(x)g(x)$ as $x\rightarrow 0$ does not exist, it suffices to show that, for every $B\in\mathbb{R}$, there exists $\epsilon >0$ such that, for every $\delta >0$, there is some $x\in\mathbb{R}$ with
$$0<|x|<\delta\quad\text{and}\quad |f(x)g(x)-B|\geq \epsilon.$$
Try to combine these statements. (Hint: since $L\neq 0$, you have $|f(x)|\geq |L|/2$ for all $x$ close enough to $0$, so $f$ is bounded away from zero near $0$.) Can you get it from here?
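Not part of the original answer, but it may help to see why the hypothesis $L\neq 0$ cannot be dropped. The following example is only illustrative:

```latex
% Why L \neq 0 matters: take f(x) = x (so L = 0) and g(x) = \sin(1/x).
% The limit of g at 0 does not exist, yet the product does converge:
\lim_{x\to 0} x\sin\!\left(\tfrac{1}{x}\right) = 0,
% by the squeeze theorem, since
\left| x\sin\!\left(\tfrac{1}{x}\right) \right| \leq |x|
\xrightarrow[x\to 0]{} 0.
```

So when $L=0$, the vanishing factor $f$ can damp the oscillation of $g$, and the conclusion of the problem fails.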