I'm reading a few proofs of the squeeze theorem to try to understand it, and I keep running into the same apparent inconsistency. For example, in this proof on Wikipedia https://en.wikipedia.org/wiki/Squeeze_theorem#Proof
at this particular step:
knowing that
1) $|x-a|<\delta_1 \implies |g(x)-L|<\epsilon$
2) $|x-a|<\delta_2 \implies |f(x)-L|<\epsilon$
3) $g(x)-L<h(x)-L<f(x)-L$
we choose $\delta = \min(\delta_1,\delta_2)$, and then it holds that
$$-\epsilon < g(x)-L < h(x)-L < f(x)-L < \epsilon$$
OK, so my question is: in lines 1) and 2), why do we have the same $\epsilon$? Wouldn't it be different for each function? Even after taking $\min(\delta_1,\delta_2)$, why would both $\epsilon_1$ and $\epsilon_2$ be equal, so that you can combine them into a single expression later on?
1) $|x-a|<\min(\delta_1,\delta_2) \implies |g(x)-L|<\epsilon_1$
2) $|x-a|<\min(\delta_1,\delta_2) \implies |f(x)-L|<\epsilon_2$
Let me know if any clarification is needed.
Let me redo the definition and proof to emphasize points that are implicit in your text but elided, and that are causing your (reasonable) confusion. The pertinent emphasis is in italics; note the subscripts of the deltas:
Definition: $\lim_{x\rightarrow a} f(x) = L$ if for any possible $\epsilon > 0$ at all there will exist a $\delta_{f,\epsilon} > 0$, *dependent upon both the function $f$ and the choice of $\epsilon$*, so that whenever we have $|x - a| < \delta_{f,\epsilon}$ it will always follow that $|f(x) - L| < \epsilon$.
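To make that dependence concrete, here is a small example of my own (not from the original post): for $f(x) = 2x$ with $a = 1$ and $L = 2$, the delta that works is determined by both $f$ and $\epsilon$:

```latex
\[
  f(x) = 2x,\qquad a = 1,\qquad L = 2,\qquad
  \delta_{f,\epsilon} = \tfrac{\epsilon}{2}:
\]
\[
  |x-1| < \tfrac{\epsilon}{2}
  \;\implies\;
  |f(x)-2| = |2x-2| = 2\,|x-1| < 2\cdot\tfrac{\epsilon}{2} = \epsilon .
\]
```

A different function (say $f(x) = 10x$) would need a different delta for the same $\epsilon$, which is exactly why the delta carries the subscript $f$.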
Squeeze Theorem: If $\lim_{x\rightarrow a} f(x) = L$ and $\lim_{x\rightarrow a} g(x) = L$, and $g(x) \le h(x) \le f(x)$ for all $x$, then $\lim_{x\rightarrow a} h(x) = L$.
Proof: We need to show that for any arbitrary $\epsilon > 0$ we choose, we will be able to find a $\delta_{h,\epsilon} > 0$, dependent on the function $h$ and our choice of $\epsilon$, so that $|x-a| < \delta_{h,\epsilon}$ implies $|h(x) - L| < \epsilon$.
For any arbitrary $\epsilon$ we know that a $\delta_{f,\epsilon}$ exists that will ... do the stuff ... for $f$ and for $\epsilon$. And we know that for the *same* $\epsilon$ we can find a different $\delta_{g,\epsilon}$ that will ... do the stuff ... for $g$ and for the same $\epsilon$. (Because such a $\delta$ exists for *every* positive $\epsilon$, it must in particular exist for this same $\epsilon$.)
We claim that if $\delta_{h,\epsilon} = \min(\delta_{f,\epsilon},\delta_{g,\epsilon})$ then we will be able to show that $\delta_{h,\epsilon}$ will do the stuff for the function $h$ and the same arbitrary value of $\epsilon$.
.... and then we do the proof.
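For completeness, the elided step is the standard one (matching the Wikipedia proof linked above):

```latex
\[
  |x-a| < \delta_{h,\epsilon} = \min(\delta_{f,\epsilon},\delta_{g,\epsilon})
  \;\implies\;
  |x-a| < \delta_{f,\epsilon} \ \text{ and } \ |x-a| < \delta_{g,\epsilon},
\]
so both $|f(x)-L| < \epsilon$ and $|g(x)-L| < \epsilon$ hold; in particular
\[
  -\epsilon < g(x)-L \qquad\text{and}\qquad f(x)-L < \epsilon .
\]
Combining these with $g(x) \le h(x) \le f(x)$ gives
\[
  -\epsilon < g(x)-L \le h(x)-L \le f(x)-L < \epsilon,
\]
hence $|h(x)-L| < \epsilon$, as required.
```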
==== old answer ====
No.
The point is that for the functions $f,g$ to have limits, the condition must be true for ALL possible $\epsilon$'s. Since it holds for every $\epsilon$, there is no need at all to use different $\epsilon$'s, and we *want* to use the same one so that the two bounds combine into a single, universal result.
In fact, if we had to use one specific $\epsilon$ (rather than an arbitrary $\epsilon$) to get the result, that would mean the function does not have that limit at that point.
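As a quick numerical sanity check (my own sketch, not part of the original argument), here is the classic squeeze example $h(x) = x^2\sin(1/x)$ with $g(x) = -x^2$, $f(x) = x^2$, $a = 0$, $L = 0$. For any $\epsilon$, the single choice $\delta = \sqrt{\epsilon}$ works for *both* bounding functions with the *same* $\epsilon$, and hence for $h$:

```python
import math

def check_squeeze(eps: float) -> bool:
    """For g(x) = -x**2 <= h(x) = x**2*sin(1/x) <= f(x) = x**2 near a = 0,
    the single delta = sqrt(eps) works for BOTH g and f with the SAME eps,
    so it also works for the squeezed function h."""
    delta = math.sqrt(eps)  # one delta, valid for g and f simultaneously
    # sample points with 0 < x < delta (the limit point x = 0 is excluded)
    xs = [delta * k / 1000 for k in range(1, 1000)]
    for x in xs:
        g, f = -x**2, x**2
        h = x**2 * math.sin(1 / x)
        # the same eps bounds both g and f ...
        assert abs(g - 0) < eps and abs(f - 0) < eps
        # ... and therefore bounds the squeezed h as well
        assert -eps < g <= h <= f < eps
    return True

# the bound holds for every eps we try, with delta depending on eps
for eps in (1.0, 0.1, 0.01, 1e-4):
    assert check_squeeze(eps)
```

Note that `delta` changes with `eps`, but within each run a single `eps` serves both bounding functions at once, which is the point of the answer above.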