Epsilon-delta proof that $ \lim_{x\to 0} {1\over x^2}$ does not exist

I'd like to see an epsilon-delta proof that the limit $\lim_{x\to 0} {1\over x^2}$ does not exist, and an explanation of exactly why it does not exist, because I am not so sure I believe that the limit fails to exist, so I need to be proved wrong.

What is the relationship between a limit existing, and the function in question having a least upper bound? Because it seems to me that the only explanation I can find as to why the limit does not exist is that the function is unbounded.

I'm not sure why this is relevant, because it seems to me that as $x$ approaches $0$, the graph of ${1\over x^2}$ gets infinitely close to the $y$-axis. This suggests to me that there does in fact exist an epsilon infinitely close to zero such that if $|x - a| < \delta$ then $|f(x)-L| < \epsilon$, where both $\delta$ and $\epsilon$ are infinitely close to zero.

Obviously, my understanding of calculus hinges on this question, so I really need to be convinced with a bulletproof explanation; otherwise I'll continue to doubt the truth. (I don't believe anything unless I fully understand it myself; for better or worse, I ignore others' authority and rely only on proof and logical understanding. I'm sorry if this attitude offends anyone!) Thanks in advance!

There are 5 answers below.

BEST ANSWER

" What is the relationship between a limit existing, and the function in question having a least upper bound? Because it seems to me that the only explanation I can find as to why the limit does not exist is that the function is unbounded."

BINGO!

A limit existing, $\lim_{x\rightarrow a}f (x)=c $, means that for every $\epsilon >0$ there is a $\delta >0$ so that whenever $0<|x-a|<\delta $ it also follows that $|f (x)-c|<\epsilon $. That is impossible if $f $ is unbounded on every interval around $0$.

Let $c $ be any real number. Clearly we can find $x $ close to $0$ so that $\frac 1{x^2} >c $.

To put it in terms of limits: if $c $ is any arbitrary real number, then for any $\epsilon >0$ we CAN'T find a $\delta $ so that $|x-0|=|x|<\delta $ would imply $|f (x)-c|=|\frac 1 {x^2} - c | < \epsilon $.
We can't do this because for any $\epsilon >0$, any $c $, and any $\delta >0$ we can find $0 < x < \min\left(\frac{1}{\sqrt {\max (c+\epsilon,\,\epsilon)}},\ \delta\right) $; then $\frac 1 {x^2} > c+\epsilon $, so $|\frac 1 {x^2}-c|>\epsilon $, and hence no $\delta$ makes the condition hold.

So the limit cannot be $c $. Since $c $ was arbitrary, the limit can't be any real number.
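The witness construction above can be sanity-checked numerically. The sketch below is my addition, not part of the original answer; the helper name `witness_x` is hypothetical. It picks an $x$ strictly below both bounds and confirms $|1/x^2 - c| > \epsilon$:

```python
import math

def witness_x(c, eps, delta):
    """Pick x strictly inside (0, min(1/sqrt(max(c+eps, eps)), delta)).

    For such x we have 1/x^2 > max(c+eps, eps) >= c+eps,
    hence |1/x^2 - c| > eps.
    """
    bound = min(1.0 / math.sqrt(max(c + eps, eps)), delta)
    return bound / 2  # halving keeps the inequalities strict

# Check the claim for a few candidate limits c, tolerances eps, and deltas.
for c in (-5.0, 0.0, 1000.0):
    for eps in (1.0, 0.01):
        for delta in (10.0, 1e-6):
            x = witness_x(c, eps, delta)
            assert 0 < x < delta
            assert abs(1.0 / x**2 - c) > eps
```

No choice of $\delta$ survives: the chosen $x$ is always inside $(0,\delta)$ yet violates the $\epsilon$-condition.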

ANSWER

I think you have the idea backwards. You want to know whether the $y$ values of the function converge to something or not as $x$ approaches $0$. The epsilon part, $|f(x)-L|$, is looking at the $y$ values, not the $x$ values. It is clear that as $x \to 0$, $f(x)=1/x^2$ grows without bound. For instance, if $x=1/10$, then $f(\frac{1}{10})=100$. If $x=1/100$, then $f(\frac{1}{100})=10000$, etc. Therefore, there is no fixed $L\in \mathbb{R}$ to which it could converge. This limit is infinity.
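A quick numerical illustration (my addition, not the answerer's) of how fast the $y$ values blow up as $x$ shrinks by powers of ten:

```python
# Tabulate f(x) = 1/x^2 as x shrinks toward 0 by powers of ten.
def f(x):
    return 1 / x**2

values = [f(10.0**-k) for k in range(1, 6)]  # x = 0.1, 0.01, ..., 0.00001
for k, v in zip(range(1, 6), values):
    print(f"x = 1e-{k}  ->  f(x) = {v:.0f}")

# The outputs grow without bound, so no single L can stay within
# a fixed epsilon of all of them.
assert all(a < b for a, b in zip(values, values[1:]))
```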

ANSWER

I'll try to sketch the idea of the proof without giving all the details, since that part of the exercise is your job.

In order for the limit to exist there must be some particular number $L$ for the limit. Then the function's value will be close to $L$ whenever $x$ is close enough to $0$.

Since $1/x^2$ is very large when $x$ is near $0$, the limit $L$ will have to be a large number if it is to exist. Could the limit be $1000$? I don't think so. You should be able to show that if you look at values of $x$ close to $0$ then $1/x^2$ will be greater than $1001$, which tells you there's no $\delta$ that works with $\epsilon = 1$.
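To make the hint concrete, here is a sketch of my own (under the answer's choice $L=1000$, $\epsilon=1$; the helper name `bad_x` is hypothetical) showing that whatever $\delta$ is proposed, some admissible $x$ lands outside the band $(999, 1001)$:

```python
import math

L, EPS = 1000.0, 1.0

def bad_x(delta):
    # Any x below both delta and 1/sqrt(L + EPS) gives 1/x^2 > 1001.
    return min(delta, 1.0 / math.sqrt(L + EPS)) / 2

for delta in (1.0, 0.1, 1e-3, 1e-9):
    x = bad_x(delta)
    assert 0 < x < delta
    assert abs(1.0 / x**2 - L) > EPS  # the epsilon-delta condition fails
```

The same construction works for any candidate $L$; that is the formal proof the answer asks you to write.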

Now write the formal proof that no number $L$ can be the limit. So there is no limit.

PS Don't say that the limit is infinity. Infinity is not a number. There is a sense in which it's correct to say the limit is infinity, but this exercise does not ask about that and you can get into trouble if you try.

PPS Phrases like "infinitely close" sometimes provide useful intuition, and they were (in a sense) the best that mathematicians managed when calculus was being invented, but you should not use them now in formal arguments. The whole $\epsilon - \delta$ thing is designed to express the idea precisely. I think you can understand many of the ideas of calculus without it, but you can't prove theorems that way.

ANSWER

An epsilon-delta proof is used to show that the limit exists and is $L$, not usually to show that no limit exists. But we can see where it fails. Suppose we claim that $\lim_{x \to 0}\frac 1{x^2}=L$. If somebody gives us an $\epsilon \gt 0$, we have to find a $\delta \gt 0$ such that $|x| \lt \delta \implies |f(x)-L|=|\frac 1{x^2}-L| \lt \epsilon$. The problem is that if $x$ is very small, $\frac 1{x^2}$ is very large and can certainly be larger than $L+\epsilon$.

You are confusing the fact that the graph of $y=\frac 1{x^2}$ gets close to the $y$-axis, which really means $\lim_{x \to 0} x=0,$ with the values of $\frac 1{x^2}$ getting close to some number, which they do not.

ANSWER

Do some examples. Let $L=57345$ and $\epsilon=0.0000003$. If $|x| < 0.0000001$ then $f (x) > 100000000000000$, so $|f (x)-L|> 99999999942655 > 0.0000003$.
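The arithmetic in this example can be verified exactly with rational numbers; the sketch below is mine, not the answerer's, and the particular $x$ is one arbitrary admissible choice:

```python
from fractions import Fraction

L = 57345
eps = Fraction(3, 10**7)       # epsilon = 0.0000003
x = Fraction(9, 10**8)         # |x| = 0.00000009 < 0.0000001

f = 1 / x**2                   # exact rational arithmetic, no rounding
assert f > 10**14              # f(x) > 100000000000000
assert f - L > 99999999942655  # so |f(x) - L| > 99999999942655
assert f - L > eps             # ... which certainly exceeds epsilon
```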

It doesn't matter what delta we choose. If we choose a large delta we can still pick a very small $x $ to get a large $f (x)-L $. And if we pick a small $\delta $ it only makes $|f (x)-L|$ larger.

And no matter how large we make $L $, we can always find a small enough $x $ to make $f (x)-L$ very large rather than small. Example: let $L=10^{10^{100}} $ and $0 <\epsilon <1$. We simply have to pick $x<10^{-10^{100}} $; then $f (x)-L>10^{2\cdot 10^{100}}-10^{10^{100}}=10^{10^{100}}(10^{10^{100}} -1)$, which last time I checked was a teeny bit more than $1$, which is larger than $\epsilon $.