Is there any way to systematically do all epsilon delta proofs?


If you want to prove that the limit of $f(x)$ as $x$ approaches $a$ is equal to $L$ using the epsilon-delta definition of the limit, you need to solve the inequality

$$|f(x)-L|<\epsilon$$

for $x$, reducing it to a sufficient condition of the form

$$|x-a|<\delta$$

for some $\delta$, which will in general be a function of $\epsilon$.

My question is, is there some way to calculate the function $\delta(\epsilon)$, short of solving the inequality above using the function $f$ you have?

Is it at least possible if $f$ is sufficiently well behaved? Like if $f$ is differentiable, can you calculate $\delta(\epsilon)$ using the derivative of $f$?

EDIT: This journal paper shows a formula for polynomials. If $f(x) = \sum_{n=0}^{k} a_n (x-a)^n$, then to prove that the limit of $f(x)$ as $x$ goes to $a$ equals $f(a)$, we can let $\delta = \min\left(1,\frac{\epsilon}{ \sum_{n=1}^{k} |a_n|}\right)$.

Can this be generalized to Taylor series? If $f(x) = \sum_{n=0}^{\infty} a_n (x-a)^n$, then can we prove that the limit of $f(x)$ as $x$ goes to $a$ equals $f(a)$ by letting $\delta = \min\left(1,\frac{\epsilon}{ \sum_{n=1}^{\infty} |a_n|}\right)$?
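As a quick sanity check, the polynomial formula can be tested numerically. The cubic below (and the grid of sample points) is my own choice of example, not one from the paper:

```python
# Sanity check of the polynomial formula delta = min(1, eps / sum |a_n|).
# The cubic below is a hypothetical example, not one from the paper.
a = 1.0
coeffs = [2.0, 3.0, 0.0, -5.0]             # a_0..a_3 in powers of (x - a)

def f(x):
    return sum(c * (x - a) ** n for n, c in enumerate(coeffs))

S = sum(abs(c) for c in coeffs[1:])        # sum_{n>=1} |a_n| = 8
for eps in [1.0, 0.1, 0.001]:
    delta = min(1.0, eps / S)
    for k in range(1, 100):                # sample points with |x - a| < delta
        x = a + delta * (k / 100.0) * (-1) ** k
        assert abs(f(x) - f(a)) < eps
```

Of course, passing on a finite sample proves nothing; the point is only to see the formula behaving as claimed.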

There are 4 answers below.

Accepted answer:

Below I deal with the power series question. I'll use your notation and assume WLOG that $a=0.$

Here's a simple solution to the general $\delta = \varphi(\epsilon)$ question that uses a different idea. Suppose the radius of convergence of the series is $r\in (0,\infty).$ Then

$$f'(x) = \sum_{n=1}^{\infty}na_nx^{n-1},\,\,|x|<r.$$

Define $D=\sum_{n=1}^{\infty}n|a_n|(r/2)^{n-1}.$ Then for $|x|<r/2,$ the mean value theorem gives

$$|f(x)-f(0)| = |f'(c_x)||x| \le D|x|.$$

Thus $\delta = \min(r/2,\epsilon/D)$ is a solution.

Note that since $r = 1/\limsup |a_n|^{1/n},$ we really do have a formula for $\delta $ as a function of $\epsilon$ that depends only on the coefficients $a_1,a_2, \dots.$ Note also that in the case $r=\infty,$ we can replace $r/2$ by $1$ in the above, and everything goes through.
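A minimal numerical sketch of this bound, taking $f(x)=1/(1-x)=\sum_{n\ge 0}x^n$ as an assumed example (so $a_n=1$, $r=1$, and $D=\sum_{n\ge 1}n(1/2)^{n-1}=4$):

```python
# Numerical sketch of delta = min(r/2, eps/D) for the assumed example
# f(x) = 1/(1 - x) = sum_{n>=0} x^n, where a_n = 1 and r = 1.
def f(x):
    return 1.0 / (1.0 - x)

r = 1.0
D = sum(n * (r / 2.0) ** (n - 1) for n in range(1, 200))   # converges to 4

for eps in [1.0, 0.1, 0.01]:
    delta = min(r / 2.0, eps / D)
    for k in range(1, 100):                # sample points with |x| < delta
        x = delta * (k / 100.0) * (-1) ** k
        assert abs(f(x) - f(0.0)) < eps
```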

Now to your specific question: Does $\delta = \min(1,\epsilon/(\sum_{n=1}^{\infty}|a_n|))$ work? The answer is yes, assuming $\sum|a_n| < \infty.$

Proof: Because $\sum|a_n| < \infty,$ the power series defining $f$ has radius of convergence at least $1.$ Let $\epsilon>0.$ Set $\delta = \min(1,\epsilon/(\sum_{n=1}^{\infty}|a_n|)).$ If $|x|<\delta,$ then

$$|f(x)-f(0)| = |\sum_{n=1}^{\infty}a_nx^n|\le \sum_{n=1}^{\infty}|a_n||x|^n$$ $$ = |x| \sum_{n=1}^{\infty}|a_n||x|^{n-1} \le |x| \sum_{n=1}^{\infty}|a_n| <\epsilon.$$
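The same $\delta$ can be sanity-checked numerically. The series $f(x)=\sum_{n\ge 0}(x/2)^n=1/(1-x/2)$, for which $\sum_{n\ge 1}|a_n|=1$, is my own assumed example:

```python
# Check delta = min(1, eps / sum |a_n|) on the assumed example
# f(x) = sum_{n>=0} (x/2)^n = 1/(1 - x/2), where sum_{n>=1} |a_n| = 1.
def f(x):
    return 1.0 / (1.0 - x / 2.0)

S = sum(0.5 ** n for n in range(1, 200))   # sum_{n>=1} |a_n|, ~1.0
for eps in [1.0, 0.25, 0.001]:
    delta = min(1.0, eps / S)
    for k in range(1, 100):                # sample points with |x| < delta
        x = delta * (k / 100.0) * (-1) ** k
        assert abs(f(x) - f(0.0)) < eps
```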

This result covers all cases where the radius of convergence is greater than $1.$ But obviously the result fails if $\sum|a_n| = \infty.$ Here we are in the case where the radius of convergence $r$ is a number in $(0,1].$ This can be handled by scaling into the $\sum|a_n| < \infty$ situation, and then scaling back. But the answer isn't as simple in this case. Since Micah's answer already covers this argument, I'll omit it here. (Note that the first method I mentioned, involving $f'(x),$ does not require this scaling argument.)

Answer:

For continuous functions, where $L=f(a)$, you need to somehow find a relation $$ |f(x)-f(a)|\le M |x-a|$$ where $M$ may depend on $a$ and on the neighborhood of $a$ you work in, but not on $x$.

Then you want to make $M\delta <\epsilon.$

The process ranges from trivial to very tricky, depending on $f(x)$.

For example $f(x)=1/x$ and $a=0.25$ requires showing $|1/x - 4| \le M|x-0.25|$ for some $M>0.$

Knowing that $|1/x-1/0.25|= \frac {|x-0.25|}{|0.25x|}$, and that if $|x-0.25|<0.1$ then $0.25x>0.0375$, we can choose $M=1/0.0375\approx 26.67$ and $\delta = \min \{0.1,\epsilon /27\}$. Thus if $|x-0.25|<\delta $, then $|1/x-4|=\frac {|x-0.25|}{|0.25x|}<(\epsilon /27)\left(\frac{1}{0.0375}\right)<\epsilon.$
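A short numerical check of this choice of $\delta$ (the sample points and test values of $\epsilon$ are arbitrary):

```python
# Numerical check of delta = min(0.1, eps/27) for f(x) = 1/x at a = 0.25.
def f(x):
    return 1.0 / x

a, L = 0.25, 4.0
for eps in [2.0, 0.5, 0.001]:
    delta = min(0.1, eps / 27.0)
    for k in range(1, 200):                # sample points with |x - a| < delta
        x = a + delta * (k / 200.0) * (-1) ** k
        assert abs(f(x) - L) < eps
```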

Answer:

Yes: if one can find an upper bound $B$ for $\left\vert\frac{f(x)-L}{x-a} \right\vert$ on some deleted $p$-neighborhood of $a$, then $\delta=\min\left\{p,\frac{\epsilon}{B}\right\}$ works.

Lemma: If there exist $p>0$ and $B>0$ such that

$$\left\vert\frac{f(x)-L}{x-a} \right\vert\le B$$

for all $x\in(a-p,a)\cup(a,a+p)$, then $\lim_{x\to a}f(x)=L$.

Proof: Let $\epsilon>0$.

Let $\delta=\min\left\{ p,\frac{\epsilon}{B}\right\}$ and suppose $0<|x-a|<\delta$. Then $|x-a|<p$, so $\left\vert\frac{f(x)-L}{x-a} \right\vert\le B$. Furthermore, $|x-a|<\frac{\epsilon}{B}$. Thus $\vert f(x)-L\vert\le B|x-a|<\epsilon$. So $\lim_{x\to a}f(x)=L$.

This approach can work when the function does not have a vertical tangent at $x=a$.

Example: Prove $\displaystyle\lim_{x\to 2}\frac{x^3+2x}{x+2}=3$

We must find an upper bound $B$ on some interval about $a=2$. It is simplest to first try $p=1$ and adjust the value later if necessary.

So we try to find an upper bound $B$ on $\left\vert\frac{f(x)-L}{x-a} \right\vert$ on the set $(1,2)\cup(2,3)$.

A bit of algebra (factoring $x^3-x-6=(x-2)(x^2+2x+3)$ and noting $(x+1)^2+2=x^2+2x+3$) shows that $\left\vert\frac{f(x)-L}{x-a} \right\vert=\frac{(x+1)^2+2}{x+2}$ on $(1,2)\cup(2,3)$. On $(1,3)$ the numerator $(x+1)^2+2$ is increasing (it is a concave-up parabola with vertex $(-1,2)$), so it is at most $18$, its value at $x=3$, while the denominator $x+2$ is at least $3$. Thus we may let $B=6$.

Now let $\epsilon>0$, $\delta=\min\left\{1,\frac{\epsilon}{6}\right\}$, and $0<\vert x-2\vert<\delta$. Then $|x-2|<1$, so $\left\vert\frac{\frac{x^3+2x}{x+2}-3}{x-2}\right\vert=\frac{(x+1)^2+2}{x+2}<6$. Furthermore, $|x-2|<\frac{\epsilon}{6}$, so $\left\vert\frac{x^3+2x}{x+2}-3\right\vert=|x-2|\cdot\left\vert\frac{\frac{x^3+2x}{x+2}-3}{x-2}\right\vert<\frac{\epsilon}{6}\cdot 6=\epsilon$

So $\displaystyle\lim_{x\to 2}\frac{x^3+2x}{x+2}=3$
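A rough numerical check of this worked example; here the bound $B$ is estimated by sampling the difference quotient rather than by algebra, so this is a sketch, not a proof:

```python
# Rough check of the worked example: estimate the bound B on
# |(f(x) - L)/(x - a)| by sampling (not rigorously), then test
# delta = min(1, eps/B) for f(x) = (x^3 + 2x)/(x + 2) at a = 2.
def f(x):
    return (x ** 3 + 2.0 * x) / (x + 2.0)

a, L = 2.0, 3.0
B = max(abs((f(a + t) - L) / t)
        for t in (s / 1000.0 for s in range(-999, 1000)) if t != 0.0)

for eps in [1.0, 0.1, 0.0001]:
    delta = min(1.0, eps / B)
    for k in range(1, 200):                # sample points with |x - a| < delta
        x = a + delta * (k / 200.0) * (-1) ** k
        assert abs(f(x) - L) < eps
```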

Answer:

Suppose $f$ can be written as a power series around $0$: that is, $f(x)=\sum_{n=0}^\infty a_n x^n$ for some sequence $\{a_n\}$. We'll examine the continuity of $f$ at zero. (Of course, you could shift the power series to some other point and this analysis would apply there as well.)

We'll also start by assuming that $f$ has a radius of convergence which is strictly greater than $1$: this implies that $\sum |a_n|$ is convergent. Later on we'll remove this assumption. Let $P_n$ be the $n$th partial sum of the series. Fix $\epsilon>0$ and let $\delta_n=\min\left(1, \frac{\epsilon/2}{\sum_{i=1}^n |a_i|}\right)$. Then, by the linked paper, if $|x|<\delta_n$, then $|P_n(x)-a_0|<\epsilon/2$.

Now, take $\delta=\min\left(1, \frac{\epsilon/2}{\sum_{i=1}^\infty |a_i|}\right)$. Then $\delta \leq \delta_n$ for all $n$. So, if $|x|<\delta$, then $|P_n(x)-a_0|<\epsilon/2$ for all $n$: that is, $P_n(x)$ lies in the open $(\epsilon/2)$-ball around $a_0$ for all $n$. Since $\lim_{n \to \infty} P_n(x)=f(x)$, it follows that $f(x)$ lies in the closure of that ball. That is, we have $|f(x)-a_0|\leq \epsilon/2 < \epsilon$ whenever $|x|<\delta$. So, for any $\epsilon>0$, we can do our $\epsilon$-$\delta$ proof with $\delta=\min\left(1,\frac{\epsilon/2}{\sum_{i=1}^\infty |a_i|}\right)$.


This works when $f$ has a large enough radius of convergence, but what about the general case? In general, to say that $f$ can be written as a power series around $0$ is to say that it has some positive radius of convergence. That is, $R=\frac{1}{\limsup_k |a_k|^{1/k}}$ is positive. Fix some $r<R$ (for definiteness, we could take $r=R/2$).

Now, let $$g(x)=f(rx)=\sum_{n=0}^\infty \left(a_n r^n\right)x^n$$ This is a power series with radius of convergence $R/r$, which is strictly greater than $1$, and so we can apply our previous result to $g$. That is, given any $\epsilon>0$, let $\delta_g=\min\left(1,\frac{\epsilon/2}{\sum_{n=1}^\infty |a_n| r^n}\right)$. Then, if $|x|<\delta_g$, $|g(x)-a_0|<\epsilon$.

Now, let $\delta=r\delta_g$. If $|x|<\delta$, then $|x/r|<\delta_g$, and so $|f(x)-a_0|=|g(x/r)-a_0|<\epsilon$. It follows that, for any $f$ which can be written as a convergent power series in a neighborhood of $0$, we can do our $\epsilon$-$\delta$ proof with $\delta=r\delta_g=r\min\left(1,\frac{\epsilon/2}{\sum_{n=1}^\infty |a_n| r^n}\right)$.
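A numerical check of the rescaled formula, assuming the scaling $g(x)=f(rx)$ and taking $f(x)=1/(1-x)$ (radius $R=1$) with $r=1/2$ as my own example:

```python
# Check the rescaled formula delta = r * min(1, (eps/2) / sum |a_n| r^n),
# assuming g(x) = f(r x), on the example f(x) = 1/(1 - x) (R = 1), r = 1/2.
def f(x):
    return 1.0 / (1.0 - x)

r = 0.5
S = sum(r ** n for n in range(1, 200))     # sum_{n>=1} |a_n| r^n, ~1.0
for eps in [1.0, 0.1, 0.001]:
    delta = r * min(1.0, (eps / 2.0) / S)
    for k in range(1, 100):                # sample points with |x| < delta
        x = delta * (k / 100.0) * (-1) ** k
        assert abs(f(x) - f(0.0)) < eps
```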


This answers the question in your edit. In all fairness I should say that I don't think it does a very good job of answering your initial question: being equal to a convergent power series in the neighborhood of a point is a highly restrictive property! (I actually think the deleted answer, which works for any continuously differentiable function, is in many ways superior to this one...)