I am trying to prove the following limit using an epsilon delta proof:
$$\lim_{x \to 1} \frac{x}{x+1} = \frac{1}{2}.$$
For every $\epsilon>0$ there is $\delta>0$ such that
if $\lvert x-1 \rvert<\delta$ then $\left\lvert \dfrac{x}{x+1} - \dfrac{1}{2} \right\rvert < \epsilon$.
I know from a proof that if $\lvert x-1 \rvert < \dfrac{1}{2}$ then $\left\lvert \dfrac{x}{x+1} - \dfrac{1}{2} \right\rvert \leq \dfrac{1}{3}\lvert x-1 \rvert$ (proof omitted for brevity).
Am I allowed to assume that $\lvert x-1 \rvert < \dfrac{1}{2}$ because $x$ is approaching $1$? More generally, does an epsilon-delta proof have to hold for all $x\in\mathbb{R}$, or can it hold true just on some interval $[a,b]$ that contains $c$ (where $x \to c$)?
Suppose that you find a $\delta_1>0$ such that, for each $x\in(a,b)$ (where $(a,b)$ is an interval to which $c$ belongs),$$\lvert x-c\rvert<\delta_1\implies\bigl\lvert f(x)-l\bigr\rvert<\varepsilon.$$

Now, take $\delta_2>0$ such that $(c-\delta_2,c+\delta_2)\subset(a,b)$. If you define $\delta=\min\{\delta_1,\delta_2\}$, then, for each real $x$, you have
\begin{align}
\lvert x-c\rvert<\delta\iff&\lvert x-c\rvert<\delta_1\text{ and }\lvert x-c\rvert<\delta_2\\
\implies&\lvert x-c\rvert<\delta_1\text{ and }x\in(a,b)\\
\implies&\bigl\lvert f(x)-l\bigr\rvert<\varepsilon.
\end{align}

So, yes, it is enough to find a $\delta$ such that$$\lvert x-c\rvert<\delta\implies\bigl\lvert f(x)-l\bigr\rvert<\varepsilon$$holds on an open interval to which $c$ belongs.
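To make this concrete, here is a sketch of how the scheme above combines with the bound from the question to finish the original proof (the choice $\delta=\min\{\frac12,3\varepsilon\}$ is one natural option, not the only one):

```latex
Given $\varepsilon>0$, choose $\delta=\min\left\{\tfrac{1}{2},\,3\varepsilon\right\}$.
If $\lvert x-1\rvert<\delta$, then in particular $\lvert x-1\rvert<\tfrac{1}{2}$,
so the bound from the question applies, and
\[
  \left\lvert \frac{x}{x+1}-\frac{1}{2} \right\rvert
  \;\le\; \frac{1}{3}\lvert x-1\rvert
  \;<\; \frac{1}{3}\cdot 3\varepsilon
  \;=\; \varepsilon .
\]
```

Here $\delta_1=3\varepsilon$ plays the role of the $\delta$ that works on the interval, and $\delta_2=\tfrac12$ keeps $x$ inside the interval where the bound is valid.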