Is $\delta(\epsilon)$ always a monotonically increasing function?


For $\lim_{x \rightarrow a}f(x) = L$, if $f(x)$ is not a constant, is $\delta(\epsilon)$ always a monotonically increasing function?

Alternatively, how does the definition of a limit guarantee that if $f(x)$ is not a constant, then a small $\epsilon$ will give me a small $\delta$?


3 Answers

Best answer (8 votes)

I'll implement the idea from my comment above. For each $n\in \Bbb N$ there is some $\delta_n>0$ such that for all $x$ with $0<|x-a|<\delta_{n}$ we have: $$|f(x)-L|<\frac{1}{n}$$

We can assume $\delta_n$ is decreasing (replace $\delta_n$ by $\min(\delta_1,\dots,\delta_n)$ if necessary). Now for each $0<\epsilon\le 1$, there is some $n_\epsilon\in \Bbb N$ such that: $$\frac{1}{2^{n_\epsilon+1}}<\epsilon\leq\frac{1}{2^{n_\epsilon}}$$

Let $$\delta_\epsilon=\delta_{2^{n_\epsilon+1}}$$ Then for all $x$ with $0<|x-a|< \delta_\epsilon$ we have: $$|f(x)-L|<\frac{1}{2^{n_\epsilon+1}}<\epsilon$$ and if $\epsilon_1<\epsilon_2$ then $n_{\epsilon_2}\le n_{\epsilon_1}$ and so

$$\delta_{\epsilon_1}=\delta_{2^{n_{\epsilon_1}+1}}\le \delta_{2^{n_{\epsilon_2}+1}}=\delta_{\epsilon_2}$$ So $\delta_{\epsilon}$ is nondecreasing, though not necessarily strictly increasing.
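This construction can be sketched numerically. In the check below, the test function $f(x)=x^2$ with $a=L=0$ and the decreasing choice $\delta_n=\sqrt{1/n}$ are my own illustrative assumptions, not part of the answer:

```python
import math

# Illustrative assumption: f(x) = x^2, a = L = 0.
# For the tolerance 1/n the decreasing choice delta_n = sqrt(1/n) works:
# 0 < |x| < delta_n  implies  |f(x)| = x^2 < 1/n.
def delta_n(n):
    return math.sqrt(1.0 / n)

def delta_eps(eps):
    """The dyadic construction from the answer: pick n_eps with
    1/2^(n_eps+1) < eps <= 1/2^(n_eps), then delta_eps = delta_{2^(n_eps+1)}."""
    n_eps = 0
    while eps <= 1.0 / 2 ** n_eps:
        n_eps += 1
    n_eps -= 1  # now 1/2^(n_eps+1) < eps <= 1/2^(n_eps)
    return delta_n(2 ** (n_eps + 1))

eps_values = [k / 100 for k in range(1, 101)]  # eps in (0, 1]
deltas = [delta_eps(e) for e in eps_values]

# The constructed delta_eps is nondecreasing in eps ...
assert all(d1 <= d2 for d1, d2 in zip(deltas, deltas[1:]))
# ... and it witnesses the limit: 0 < |x| < delta_eps implies |f(x)| < eps.
for e, d in zip(eps_values, deltas):
    x = 0.999 * d
    assert x * x < e
```

The monotonicity comes for free: a larger $\epsilon$ gives a smaller $n_\epsilon$, hence a smaller index $2^{n_\epsilon+1}$, hence a larger (or equal) $\delta$.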

Answer (7 votes)

You are free to choose $\delta_\epsilon$, so you could do it in a non-monotonic way. For instance, with $f(x)=x$ and $a=L=0$, you could take $\delta_{\frac{1}{n}}=\frac{1}{n^2}$ for $n\geq 1$ and $\delta_\epsilon=\epsilon$ otherwise.
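This non-monotonic choice can be verified mechanically. The sketch below encodes the answer's own example ($f(x)=x$, $a=L=0$), using exact rationals so that $\epsilon=1/n$ is detected without floating-point error:

```python
from fractions import Fraction

# The answer's example: f(x) = x, a = L = 0.  A valid but non-monotonic choice:
# delta_{1/n} = 1/n^2 for integer n >= 1, and delta_eps = eps otherwise.
def delta(eps):
    if eps > 0 and 1 / eps == int(1 / eps):  # eps = 1/n for some integer n >= 1
        n = int(1 / eps)
        return Fraction(1, n * n)
    return eps

# Every choice is valid: delta(eps) <= eps, so 0 < |x| < delta(eps) => |f(x)| < eps.
for eps in [Fraction(1, n) for n in range(1, 20)] + [Fraction(2, 5), Fraction(3, 7)]:
    assert delta(eps) <= eps

# But it is not monotonic: 2/5 < 1/2, yet delta(2/5) = 2/5 > 1/4 = delta(1/2).
assert Fraction(2, 5) < Fraction(1, 2)
assert delta(Fraction(2, 5)) > delta(Fraction(1, 2))
```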

But if you take, for each $\epsilon>0$, the optimal $\delta_\epsilon$, then you get a nondecreasing function. More generally, I'll do it for a function between two metric spaces.

For each $\epsilon>0$, let $\Delta_\epsilon$ be the set of all $\delta>0$ such that $0<d(x,a)\leq \delta$ implies $d(f(x),L)\leq \epsilon$, that is, $f(B'(a,\delta))\subseteq B(L,\epsilon)$. By definition of the limit, every $\Delta_\epsilon$ is a nonempty subset of $(0,+\infty)$. The optimal $\delta$ is $\delta_\epsilon:=\sup\Delta_\epsilon$, which is possibly infinite. Note that since I used non-strict inequalities, when $\delta_\epsilon$ is finite it belongs to $\Delta_\epsilon$. Note also that $\Delta_\epsilon$ is an interval: $\Delta_\epsilon=(0,\delta_\epsilon]$ in the finite case, and $(0,+\infty)$ otherwise.

Note: by $B'(a,\delta)$, I mean the closed ball of radius $\delta$ centered at $a$, intersected with the domain of $f$, with $a$ removed. By $B(L,\epsilon)$, I simply mean the closed ball of radius $\epsilon$ centered at $L$.

In any case, it is easily seen that, whenever $\epsilon<\epsilon'$, $\Delta_\epsilon\subseteq \Delta_{\epsilon'}$ whence $\sup\Delta_\epsilon\leq \sup\Delta_{\epsilon'}$, i.e. $\delta_\epsilon\leq \delta_{\epsilon'}$. This is simply because $B(L,\epsilon)\subseteq B(L,\epsilon')$, so $f(B'(a,\delta))\subseteq B(L,\epsilon)$ implies $f(B'(a,\delta))\subseteq B(L,\epsilon')$.
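A numerical sketch may make the optimal $\delta_\epsilon$ and its monotonicity concrete. The test function $f(x)=\sin x$ with $a=L=0$ is my own assumption, and the scan-based approximation of $\sup\Delta_\epsilon$ is only a rough sketch:

```python
import math

# Rough numerical approximation of the optimal delta for f(x) = sin(x),
# a = L = 0 (my own illustrative choice): the largest d such that
# 0 < |x| <= d implies |f(x)| <= eps, found by scanning outward from 0.
def delta_opt(eps, step=1e-4, scan_max=10.0):
    d = 0.0
    x = step
    while x <= scan_max:
        if abs(math.sin(x)) > eps:  # sin is odd, so scanning x > 0 suffices
            break
        d = x
        x += step
    return d

eps_grid = [k / 50 for k in range(1, 50)]  # eps in (0, 1)
deltas = [delta_opt(e) for e in eps_grid]

# Monotonicity: eps < eps' gives Delta_eps contained in Delta_eps',
# hence delta_eps <= delta_eps'.
assert all(d1 <= d2 for d1, d2 in zip(deltas, deltas[1:]))
# For this f, the optimal delta is arcsin(eps): the first point where |sin| exceeds eps.
assert all(abs(d - math.asin(e)) < 1e-3 for e, d in zip(eps_grid, deltas))
```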

If $\delta_\epsilon=+\infty$ for every $\epsilon>0$, this means that $d(f(x),L)\leq \epsilon$ for every $x\neq a$ and every $\epsilon$. Thus $f(x)=L$ for every $x\neq a$, and $f$ is constant. So if $f$ is not constant, there exists $\epsilon_0>0$ such that $\delta_{\epsilon_0}<\infty$, and by monotonicity, $\delta_\epsilon\leq \delta_{\epsilon_0}<\infty$ for every $0<\epsilon\leq \epsilon_0$. And of course, if $f$ is constant, then $\delta_\epsilon=+\infty$ for every $\epsilon>0$.

Finally, to answer your question: by monotonicity, $\lim_{\epsilon\rightarrow 0}\delta_\epsilon$ exists; call it $\delta$, so $\delta\geq 0$. If we had $\delta>0$, then for every $x$ with $0<d(x,a)\leq \delta$ we would get $0<d(x,a)\leq \delta_\epsilon$ for every $\epsilon>0$, hence $d(f(x),L)\leq \epsilon$ for every $\epsilon>0$, that is, $f(x)=L$. So $f(x)=L$ whenever $0<d(x,a)\leq \delta$; that is, $f$ is eventually constant. The converse is clear.

Conclusion: the optimal $\delta_\epsilon$ yields a nondecreasing function of $\epsilon$. It is eventually finite if and only if $f$ is not constant. And $\lim_{\epsilon\rightarrow 0}\delta_\epsilon=0$ (that is "a small $\epsilon$ gives you a small $\delta$") if and only if $f$ is not eventually constant. Note that $f(a)$ could be anything, in any case, if defined. So when I say (eventually) constant, that's up to this particular value.

Note: interestingly, even if $f$ is uniformly continuous (and even $1$-Lipschitz), the function $\epsilon\longmapsto \delta_\epsilon$ need not be continuous. Take $f(x)=x$ on $[0,1]$, $f(x)=2-x$ on $[1,2]$, and $f(x)=x-2$ on $[2,+\infty)$. Then $\lim_{x\rightarrow 0}f(x)=0$. For this limit, we have $\delta_\epsilon=2+\epsilon$ for $\epsilon\geq 1$, while $\delta_\epsilon=\epsilon$ for $0<\epsilon<1$. So $\delta_\epsilon$ is discontinuous at $1$.
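The jump at $\epsilon=1$ can be checked numerically. Below is a rough Python approximation of the optimal $\delta_\epsilon$ for this piecewise example; the scanning scheme is my own sketch of $\sup\Delta_\epsilon$:

```python
def f(x):
    # The 1-Lipschitz example from the answer, on [0, +inf).
    if x <= 1:
        return x
    if x <= 2:
        return 2 - x
    return x - 2

def delta_opt(eps, step=1e-4, scan_max=10.0):
    """Approximate the optimal delta for lim_{x->0} f(x) = 0:
    the largest d such that 0 < x <= d implies f(x) <= eps."""
    d = 0.0
    x = step
    while x <= scan_max:
        if f(x) > eps:
            break
        d = x
        x += step
    return d

# Closed form from the answer: delta_eps = eps for 0 < eps < 1, and 2 + eps for eps >= 1.
assert abs(delta_opt(0.5) - 0.5) < 1e-3
assert abs(delta_opt(1.5) - 3.5) < 1e-3
# Jump at eps = 1: the left limit is 1, but delta_1 = 3.
assert abs(delta_opt(0.999) - 0.999) < 1e-3
assert abs(delta_opt(1.0) - 3.0) < 1e-3
```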

Answer (0 votes)

It is easy to see that if you pick an optimal $\delta_\epsilon$ then for all $\epsilon < \epsilon'$ we have $\delta_{\epsilon} \leq \delta_{\epsilon'}$.

Indeed, if $0< |x-a| < \delta_{\epsilon}$ then $|f(x)-L| < \epsilon < \epsilon'$, which means that $\delta_{\epsilon}$ works for $\epsilon'$. Hence $\delta_{\epsilon}$ is less than or equal to the largest $\delta$ which works for $\epsilon'$, which is $\delta_{\epsilon'}$.

You cannot do better than this. For example, define $f:[-1,1]\to\Bbb R$ as follows: for $|x| \in (\frac{1}{n+1},\frac{1}{n}]$ let $f(x)=\frac{1}{n}$. The value $f(0)$ is irrelevant, as the limit does not depend on it.

Then $\lim_{x \to 0}f(x)=0$, and an easy computation shows that for all $\epsilon$ with $\frac{1}{n+1}<\epsilon<\frac{1}{n}$ we have

$$\delta_{\epsilon} = \frac{1}{n+1}$$

This shows that $\delta_\epsilon$ is constant on $(\frac{1}{n+1},\frac{1}{n})$.