Theorem: Let $A \subset \mathbb{R}$ and let $f: A \to \mathbb{R}$ be continuous at a point $c \in A$. Show that for any $\epsilon > 0$, there exists a neighborhood $V_{\delta}(c)$ of $c$ such that if $x, y \in A \cap V_{\delta}(c)$, then $|f(x) - f(y)| < \epsilon$.
Proof: Since $f$ is continuous at $c \in A$, for every $\epsilon > 0$ there exists $\delta_1 > 0$ such that if $x \in A$ and $|x - c| < \delta_1$, then $|f(x) - f(c)| < \epsilon/2$.
Similarly, applying the same definition with the same $\epsilon$, there exists $\delta_2 > 0$ such that if $y \in A$ and $|y - c| < \delta_2$, then $|f(y) - f(c)| < \epsilon/2$.
Choose $\delta = \min\{\delta_1, \delta_2\}$. If $x, y \in A \cap V_{\delta}(c)$, then $|x - c| < \delta \le \delta_1$ and $|y - c| < \delta \le \delta_2$, so both estimates above apply.
Then $-\epsilon/2 < f(x) - f(c) < \epsilon/2$ and $-\epsilon/2 < f(y) - f(c) < \epsilon/2$.
Subtracting the second chain of inequalities from the first (which reverses its bounds) gives $-\epsilon < (f(x) - f(c)) - (f(y) - f(c)) < \epsilon$,
that is, $-\epsilon < f(x) - f(y) < \epsilon$,
and hence $|f(x) - f(y)| < \epsilon$. $\blacksquare$
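To see the estimate in action, here is a concrete instance (my own illustrative example, not part of the exercise): take $f(x) = 2x$ on $A = \mathbb{R}$ with $c = 0$.

```latex
% Concrete check: f(x) = 2x, c = 0, given \epsilon > 0.
% Continuity at c: |f(x) - f(0)| = 2|x| < \epsilon/2 whenever |x| < \epsilon/4,
% so we may take \delta_1 = \delta_2 = \epsilon/4, hence \delta = \min\{\delta_1, \delta_2\} = \epsilon/4.
\[
x, y \in V_{\delta}(0) \implies
|f(x) - f(y)| = 2|x - y| \le 2|x| + 2|y|
< 2 \cdot \tfrac{\epsilon}{4} + 2 \cdot \tfrac{\epsilon}{4} = \epsilon.
\]
```

This matches the general argument: each point contributes an error of less than $\epsilon/2$ relative to $f(c)$, and the two errors combine to less than $\epsilon$.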
Can anyone please verify this proof and let me know if it is correct?
Thank you.