For a homework assignment, I was told to formally prove the squeeze theorem for functions. I was able to come up with an appropriate proof except for one final step.
Theorem: Let $f$, $g$ and $h$ be functions $D\to\Bbb R$, where $D\subseteq\Bbb R$, and let $c$ be an accumulation point of $D$. Suppose that $f(x)\le g(x)\le h(x)$ for all $x\in D\setminus\{c\}$. If $\lim\limits_{x\to c}f(x) = \lim\limits_{x\to c}h(x) = L\in\Bbb R$, then $\lim\limits_{x\to c}g(x) = L$.
Proof: By definition, for all $\newcommand{\e}{\varepsilon} \e>0$ there exists a $\newcommand{\d}{\delta} \d_f>0$ such that $\newcommand{\l}{\lvert} \newcommand{\r}{\rvert} \l f(x)-L\r<\e$ whenever $0<\l x-c\r<\d_f$. Likewise, for all $\e>0$ there exists a $\d_h>0$ such that $\l h(x)-L\r<\e$ whenever $0<\l x-c\r<\d_h$.
When $x$ satisfies both conditions, unpacking the absolute values gives $L-\e<f(x)<L+\e$ and $L-\e<h(x)<L+\e$. Combining these with the hypothesis $f(x)\le g(x)\le h(x)$ yields $L-\e<f(x)\le g(x)\le h(x)<L+\e$, and in particular $L-\e<g(x)<L+\e$.
Let $\d_g = \min\{\d_f, \d_h\}$. Since $\d_g\le\d_f$ and $\d_g\le\d_h$, the condition $0<\l x-c\r<\d_g$ implies both $0<\l x-c\r<\d_f$ and $0<\l x-c\r<\d_h$. Thus for all $\e>0$ there exists a $\d_g>0$ such that $L-\e<g(x)<L+\e$, i.e. $\l g(x)-L\r<\e$, whenever $0<\l x-c\r<\d_g$.
Therefore by definition $g(x)$ converges to $L$ as $x\to c$. $\blacksquare$
I don’t understand why the proof requires $\min$ and not $\max$. In my mind, if I want to ensure that $\l x-c\r<\d_g$, then I should make $\d_g$ as big as possible, not as small as possible.
Could someone shed some light on this apparent discrepancy?
If $|x-c| <\max \{\delta_f,\delta_h\}$, you can only conclude that $|x-c|$ is smaller than the larger of $\delta_f$ and $\delta_h$, not necessarily smaller than both. In that case only one of the two inequalities, the one for $f(x)$ or the one for $h(x)$, is guaranteed to hold, and the squeeze argument needs both. Taking $\min \{\delta_f,\delta_h\}$ guarantees $|x-c|<\delta_f$ and $|x-c|<\delta_h$ simultaneously, which is why the proof requires $\min$ and not $\max$.
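To see the failure concretely, here is a small numeric sketch. The functions, $\e$, and the test points are my own choices (not from the question), with $c = 0$ and $L = 0$: for $f(x)=-|x|$ we may take $\delta_f=\e$, while $h(x)=2|x|$ forces the smaller $\delta_h=\e/2$.

```python
# Concrete illustration (hypothetical example): c = 0, L = 0.
f = lambda x: -abs(x)     # |f(x) - 0| < eps whenever |x| < eps      -> delta_f = eps
h = lambda x: 2 * abs(x)  # |h(x) - 0| < eps whenever |x| < eps / 2  -> delta_h = eps / 2
g = h                     # f(x) <= g(x) <= h(x) holds for every x

eps = 1.0
delta_f, delta_h = eps, eps / 2

x = 0.8  # within max(delta_f, delta_h) = 1.0, but outside min = 0.5
assert abs(x - 0) < max(delta_f, delta_h)
assert abs(g(x) - 0) >= eps          # g's inequality FAILS under the max

x = 0.4  # within min(delta_f, delta_h) = 0.5
assert abs(x - 0) < min(delta_f, delta_h)
assert abs(f(x) - 0) < eps           # f's inequality holds...
assert abs(h(x) - 0) < eps           # ...and so does h's...
assert abs(g(x) - 0) < eps           # ...so g is squeezed as claimed
```

Inside the max-ball there are points (like $x=0.8$) where only $f$'s estimate is guaranteed, and $g$ can escape the band $(L-\e, L+\e)$; inside the min-ball both estimates hold, so $g$ cannot.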