Subtle Analysis Problem


Suppose you have a function $f \colon A \to \mathbf{R}$ and $(a - \delta', a + \delta') \subseteq A$ for some $\delta' > 0$. Suppose also that $f$ is continuous at $a$. How do you prove that $\lim_{x \to a} f(x) = f(a)$?

Definition of continuity: for all $\epsilon > 0$ there exists $\delta > 0$ such that $x \in A$ and $|x - a| < \delta$ imply $|f(x) - f(a)| < \epsilon$.

Definition of limit as $x$ approaches $a$ of $f(x)$ equals $f(a)$: For all $\epsilon > 0$, there exists $\delta > 0$ such that $0 < |x - a| < \delta$ implies $|f(x) - f(a)| < \epsilon$.

These definitions are very similar, but there are some subtleties I can't sort out. One is the extra requirement $0 < |x - a|$ in the second definition. Another is that the second definition does not require $x \in A$. And where does $\delta'$ come into play?

I once saw a sketch of a solution that involved taking $\min(\delta, \delta')$, but I have forgotten the details and am trying to reconstruct them.
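
If I am reconstructing it correctly, the argument would run roughly as follows (treating the limit definition as implicitly requiring $x \in A$, since otherwise $f(x)$ is undefined):

Let $\epsilon > 0$. By continuity of $f$ at $a$, choose $\delta > 0$ such that $x \in A$ and $|x - a| < \delta$ imply $|f(x) - f(a)| < \epsilon$. Set
$$\delta'' = \min(\delta, \delta') > 0.$$
Now suppose $0 < |x - a| < \delta''$. Since $|x - a| < \delta'$, we get $x \in (a - \delta', a + \delta') \subseteq A$, so $x \in A$. Since also $|x - a| < \delta$, the choice of $\delta$ gives $|f(x) - f(a)| < \epsilon$. Hence for every $\epsilon > 0$ there is a $\delta'' > 0$ such that $0 < |x - a| < \delta''$ implies $|f(x) - f(a)| < \epsilon$, i.e. $\lim_{x \to a} f(x) = f(a)$.

If this is right, the role of $\delta'$ is exactly to guarantee $x \in A$, which is where the two definitions differ. Is this the intended argument?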