Proving Limits Using Sequences


Prove that if $$\lim_{x \to c}f(x)=L \quad\text{and}\quad \lim_{x \to c}g(x)=c,$$ then $$\lim_{x \to c}f(g(x))=L,$$ but that the converse is not true.

My attempt: $\lim_{x \to c}f(x)=L$ means that we can keep $f(x)$ as close to $L$ as we want by keeping $x$ sufficiently close to $c$, and $\lim_{x \to c}g(x)=c$ means that we can keep $g(x)$ as close to $c$ as we want by keeping $x$ sufficiently close to $c$. So we can keep $f(g(x))$ as close to $L$ as we want by keeping $g(x)$ sufficiently close to $c$, which in turn can be done by keeping $x$ sufficiently close to $c$.

For the converse: if $\lim_{x \to c}f(g(x))=L$ and $\lim_{x \to c}g(x)=c$, that does not imply $\lim_{x \to c}f(x)=L$, because $\lim_{x \to c}f(g(x))=L$ only says that $f(g(x))$ approaches $L$ as $g(x)$ approaches $c$ through one particular sequence of values, whereas for $\lim_{x \to c}f(x)=L$ to hold, $f(x)$ must approach $L$ no matter which sequence of values $x$ takes while approaching $c$.

Am I right?
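The informal argument for the first part can be written as an $\varepsilon$–$\delta$ chain. A sketch, under the additional assumption that $g(x) \neq c$ for $x$ near $c$ (some hypothesis of this kind, or continuity of $f$ at $c$, is needed to apply the limit of $f$, since $\lim_{y\to c}f(y)$ says nothing about the value $f(c)$):

```latex
Let $\varepsilon > 0$. Since $\lim_{y \to c} f(y) = L$, there is $\delta_1 > 0$ such that
\[
  0 < |y - c| < \delta_1 \implies |f(y) - L| < \varepsilon .
\]
Since $\lim_{x \to c} g(x) = c$, there is $\delta > 0$ such that
\[
  0 < |x - c| < \delta \implies |g(x) - c| < \delta_1 .
\]
If in addition $g(x) \neq c$ on this punctured neighbourhood, then
$0 < |g(x) - c| < \delta_1$ there, so
\[
  0 < |x - c| < \delta \implies |f(g(x)) - L| < \varepsilon ,
\]
which is exactly $\lim_{x \to c} f(g(x)) = L$.
```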


Your reasoning for the first part is correct (though it should be made more precise). The second part is actually easier to show. Consider the constant function $g\equiv c$ and $f=1_{\{c\}}$, the indicator function of $\{c\}$ (here $L=1$). Then $\lim_{x\to c}f(g(x))=\lim_{x\to c}f(c)=f(c)=1=L$, but $\lim_{x\to c}f(x)=0$.
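The counterexample can be checked numerically; a minimal sketch (taking $c=0$, an arbitrary choice, since the argument works for any $c$):

```python
# Counterexample: g identically c, f the indicator function of {c}, so L = 1.
c = 0.0  # any value of c works; 0.0 is an arbitrary choice for the check

def g(x):
    return c  # the constant function g(x) = c

def f(y):
    return 1.0 if y == c else 0.0  # f = 1 on {c}, 0 everywhere else

# Sample points near c but not equal to c (a punctured neighbourhood of c).
near = [c + t for t in (0.1, 0.01, -0.001, 1e-6, -1e-9)]

# f(g(x)) = f(c) = 1 for every x, so lim_{x -> c} f(g(x)) = 1 = L ...
assert all(f(g(x)) == 1.0 for x in near)

# ... while f(x) = 0 at every point x != c, so lim_{x -> c} f(x) = 0 != L.
assert all(f(x) == 0.0 for x in near)
```

The check relies on exact floating-point comparison, which is safe here only because `g` returns `c` literally.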