Is this method of proving an infimum of a set valid?


My question is this:

Given the set $$ A = \left\{ \frac{x}{1+x^2} : x \geq 1 \right\} $$

Prove $$ \inf A = 0 $$

Why can I not simply treat the defining expression as a function $f(x)$ and say that, since $f'(x)$ is negative for all $x > 1$, the infimum is equal to the limit of $f(x)$ as $x$ approaches infinity?

I asked my TA for this course (real analysis) about this, and couldn't really get much of an answer other than "you shouldn't / it's better to use the definition so you get all the points on your test."

I'm curious as to why this wouldn't (or maybe it would?) work in this case. Thanks for any explanations!

4 Answers

Accepted answer:

I would say they want you to use the tools you have learned in analysis. If you are only at the level of learning and applying the definition of infimum, I imagine you have not seen rigorous proofs of fairly standard facts we use all the time in calculus. So if I were the exam marker and someone used this method of proof, I would also want a proof that if $f'(x_0)$ exists and is negative for all $x_0\in [a,\infty)$, then $f$ is strictly decreasing on $[a,\infty)$ and hence $\inf_{x\ge a}f(x)=\lim_{x\to\infty}f(x)$. You would also need to give an $\epsilon$-$\delta$ proof that the limit really is zero, which in this case would closely follow the direct proof that the infimum is zero.
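(A sketch of the "negative derivative implies strictly decreasing" step the answer asks for, not part of the original answer: it is the standard Mean Value Theorem argument. For $a \le x_1 < x_2$,

$$ f(x_2) - f(x_1) = f'(c)\,(x_2 - x_1) < 0 \quad \text{for some } c \in (x_1, x_2), $$

since $f'(c) < 0$ and $x_2 - x_1 > 0$; hence $f(x_2) < f(x_1)$.)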

Answer:

If $f' < 0$ from some point on, then your function is strictly decreasing from that point on, so you are right that if the lowest value is not attained before that point, then the limit as $x \to \infty$ is the infimum. However, the case you have to worry about is when the function dips to some lower value earlier, and then the derivative is eventually always negative but the limit as $x \to \infty$ is not as low as that earlier dip. So, strictly speaking, you need to rule that case out. If you can, your argument is fine regardless of what your TA says.
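To illustrate the caveat with a concrete example (my own, not from the answer above): on $[1,\infty)$ take

$$ g(x) = e^{-x} - 2e^{-(x-2)^2}. $$

Here $g'(x) < 0$ for all sufficiently large $x$ and $\lim_{x\to\infty} g(x) = 0$, yet $g(2) = e^{-2} - 2 < 0$, so $\inf g$ lies strictly below the limit. (For the set $A$ above no such dip occurs, since $\frac{x}{1+x^2}$ is decreasing on all of $[1,\infty)$.)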

Answer:

You simply show that for every $a > 0$ there exists $b \geq 1$ such that $\dfrac{b}{b^2+1} < a$. Observe that $b^2 + 1 > b^2$, so $\dfrac{b}{b^2+1} < \dfrac{b}{b^2} = \dfrac{1}{b}$. Taking $b = \dfrac{1}{a} + 1$ gives $\dfrac{1}{b} = \dfrac{a}{a+1} < a$. Since every element of $A$ is positive, $0$ is a lower bound, and the above shows that no $a > 0$ can be a lower bound; hence $\inf A = 0$.
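As a quick numerical sanity check (not a substitute for the proof; the function names here are my own), one can verify that the witness $b = \frac{1}{a} + 1$ really does satisfy $b \geq 1$ and $\frac{b}{b^2+1} < a$ for a few sample values of $a$:

```python
def f(x):
    # The function defining the set A: f(x) = x / (1 + x^2), for x >= 1.
    return x / (1 + x**2)

def witness(a):
    # For a given a > 0, the proposed witness b = 1/a + 1 (always >= 1).
    return 1 / a + 1

# Check that f(b) < a, so no positive a is a lower bound of A.
for a in [1.0, 0.1, 0.01, 1e-6]:
    b = witness(a)
    assert b >= 1 and f(b) < a
```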

Answer:

I'd say they want you to demonstrate a method that does not depend on the defining expression being a differentiable, or even continuous, function. The expression also doesn't have to have a limit, and even when the limit exists, it need not equal the infimum.
As others have said, to make your solution a complete proof you would also have to show that the limit exists before computing it, and prove that the function is decreasing (so that the limit is in fact the infimum).