Let's say we have a function $f$ defined on an interval centered at $a \in \mathbb{R}$, and let $x \in \mathbb{R}$.
Suppose we know that $$ |x - a| < \delta \implies |f(x) - L'| < \varepsilon, $$ where $L' \in \mathbb{R}$ and $\delta, \varepsilon > 0$ are just some fixed numbers (a single instance, not the full definition of a limit).
I think we cannot reasonably deduce that $$ \lim_{x \to a} f(x) = L', $$ because the definition of the limit requires that for every epsilon there exists a delta, whereas here we only get information about values of epsilon greater than or equal to the one we were given.
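Concretely, the single statement above only propagates upward to larger tolerances: $$ |f(x) - L'| < \varepsilon \implies |f(x) - L'| < \varepsilon' \qquad \text{for every } \varepsilon' \geq \varepsilon, $$ while it says nothing at all about tolerances smaller than $\varepsilon$.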
But what can we say about $L'$ with respect to the actual $\lim_{x \to a} f(x)$?
EDIT: Forgot to add that we know that $\lim_{x \to a} f(x)$ exists.
At first glance, it looks like the actual limit could be anything in the interval $[L' - \varepsilon, L' + \varepsilon]$, but I don't see how to prove that rigorously.
The limit does not have to exist at all. For example, take $$f(x)=\begin{cases}\sin\frac{1}{x}, & x\neq 0,\\ 0, & x=0,\end{cases}$$ and choose $L'=0$, $a=0$, $\varepsilon=1.1$ and $\delta=1$.
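To spell out why this works: for every $x$ with $|x - 0| < 1$ we have $$ |f(x) - 0| \leq 1 < 1.1 = \varepsilon, $$ so the hypothesis holds, yet $\sin\frac{1}{x}$ oscillates between $-1$ and $1$ arbitrarily close to $0$, so $\lim_{x \to 0} f(x)$ does not exist.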
Edit: If the limit exists, we have $|\lim_{x\rightarrow a}f(x)-L'|\leq \varepsilon$. Let me show you how. Pick a sequence $(x_n)$ with $x_n\neq a$ and $x_n\rightarrow a$, so that $f(x_n)\rightarrow\lim_{x\rightarrow a}f(x)$. There is an $N_0$ such that $|x_n-a|<\delta$ for all $n\geq N_0$, so these $x_n$ satisfy $|f(x_n)-L'|<\varepsilon$. Since $|\cdot|$ is continuous, the result follows by letting $n\rightarrow\infty$.
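Written out, that last step is just passing to the limit in the inequality: $$ \Big|\lim_{x\to a} f(x) - L'\Big| = \lim_{n\to\infty} \big|f(x_n) - L'\big| \leq \varepsilon, $$ and since a strict inequality may become an equality in the limit, we only get $\leq \varepsilon$, not $< \varepsilon$.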