Proving $f: A \to \mathbb{R}$ is continuous at $a \in A$ knowing $(a - \delta', a + \delta') \subset A$ for some $\delta' > 0$


I've been working on this question for a while now and I can't seem to figure it out.

Suppose $f: A \to \mathbb{R}$ is a function and $A$ contains an interval $(a - \delta', a + \delta')$ for some $\delta' > 0$. Prove that if $f$ is continuous at $a$, then $\lim_{x\to a} f(x) = f(a)$.

Could anybody give some tips or ways to prove this? Thanks!


If you know that $f$ is continuous at $a$, then for every $\epsilon > 0$ you can find a $\delta > 0$ such that $|f(x) - f(a)| < \epsilon$ whenever $x \in A$ and $|x - a| < \delta$. This is almost exactly the definition of $\lim_{x \to a} f(x) = f(a)$; the only thing left to check is that the limit statement makes sense, i.e. that the points $x$ near $a$ actually lie in the domain $A$. This is where the hypothesis $(a - \delta', a + \delta') \subset A$ comes in: replacing $\delta$ by $\min(\delta, \delta')$ if necessary, every $x$ with $0 < |x - a| < \delta$ belongs to $A$, so $f(x)$ is defined and satisfies $|f(x) - f(a)| < \epsilon$. You can therefore conclude that $\underset{x \to a}{\lim} f(x) = f(a)$.
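A full write-up of this argument, in the usual $\epsilon$–$\delta$ style, might look like the following (a sketch, assuming the standard definitions of continuity and of a limit for a function on a subset of $\mathbb{R}$):

```latex
\textbf{Claim.} If $f\colon A \to \mathbb{R}$ is continuous at $a$ and
$(a - \delta', a + \delta') \subset A$ for some $\delta' > 0$, then
$\lim_{x \to a} f(x) = f(a)$.

\textbf{Proof.} Let $\epsilon > 0$. By continuity of $f$ at $a$, there is a
$\delta_0 > 0$ such that $|f(x) - f(a)| < \epsilon$ for all $x \in A$ with
$|x - a| < \delta_0$. Set $\delta = \min(\delta_0, \delta') > 0$.
Now suppose $0 < |x - a| < \delta$. Then $|x - a| < \delta'$, so
$x \in (a - \delta', a + \delta') \subset A$ and $f(x)$ is defined; and since
$|x - a| < \delta_0$, continuity gives $|f(x) - f(a)| < \epsilon$.
As $\epsilon > 0$ was arbitrary, $\lim_{x \to a} f(x) = f(a)$. \qed
```

Note that the interval hypothesis is only used to guarantee that $f(x)$ is defined for every $x$ with $0 < |x - a| < \delta$; the $\epsilon$–$\delta$ estimate itself comes straight from continuity.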