Prob. 2, Sec. 6.3, in Bartle & Sherbert's INTRO TO REAL ANALYSIS, 4th ed: If $f\to A>0$, $g>0$, and $g\to 0$, then $f/g\to +\infty$; ...


Here is Prob. 1, Sec. 6.3, in the book Introduction to Real Analysis by Robert G. Bartle & Donald R. Sherbert, 4th edition:

Suppose that $f$ and $g$ are continuous on $[a, b]$, differentiable on $(a, b)$, that $c \in [a, b]$ and $g(x) \neq 0$ for $x \in [a, b]$, $x \neq c$. Let $A \colon= \lim_{x\to c} f$ and $B \colon= \lim_{x\to c} g$. If $B = 0$, and if $\lim_{x \to c} f(x)/g(x)$ exists in $\mathbb{R}$, show that we must have $A=0$. [ Hint: $f(x) = \big\{ f(x) / g(x) \big\} g(x)$.]
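For reference, one way the hint unwinds: writing $L := \lim_{x \to c} f(x)/g(x)$ (which exists in $\mathbb{R}$ by hypothesis), the product rule for limits gives

```latex
A = \lim_{x \to c} f(x)
  = \lim_{x \to c} \left( \frac{f(x)}{g(x)} \cdot g(x) \right)
  = \left( \lim_{x \to c} \frac{f(x)}{g(x)} \right) \left( \lim_{x \to c} g(x) \right)
  = L \cdot B
  = L \cdot 0
  = 0 .
```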

Here is my Math SE post on this problem.

And, here is Prob. 2, Sec. 6.3:

In addition to the suppositions of the preceding exercise, let $g(x) > 0$ for $x \in [a, b]$, $x \neq c$. If $A > 0$ and $B = 0$, prove that we must have $\lim_{x \to c} f(x)/g(x) = \infty$. If $A < 0$ and $B = 0$, prove that we must have $\lim_{x \to c} f(x)/ g(x) = -\infty$.

My Attempt:

Suppose that $f$ and $g$ are real-valued functions defined on the closed interval $[a, b]$ and that $c \in [a, b]$ is such that

(i) $g(x) > 0$ for $x \in [a, b]$ such that $x \neq c$,

(ii) $\lim_{x \to c} f(x)$ exists in $\mathbb{R}$, and

(iii) $\lim_{x \to c} g(x) = 0$.

Let us put $A \colon= \lim_{x \to c} f(x)$ and $B \colon= \lim_{x \to c} g(x)$. Then $B = 0$ of course.

We study the following two cases according as $A> 0$ or $A < 0$.

Case 1. First suppose that $A > 0$.

Let $\alpha \in \mathbb{R}$ be arbitrary. Since $\lim_{x \to c} f(x) = A > 0$, there exists a real number $\delta_1 > 0$, depending on $A$, such that $$ \big\lvert f(x) - A \big\rvert < \frac{A}{2} $$ for all $x \in [a, b]$ such that $0 < \big\lvert x-c \big\rvert < \delta_1$. Therefore we have $$ f(x) > \frac{A}{2} > 0 \tag{1} $$ for all $x \in [a, b]$ such that $0 < \big\lvert x-c \big\rvert < \delta_1$.

Now, since $\lim_{x \to c} g(x) = 0$ and $$ \frac{A}{2 \big( \lvert \alpha \rvert + 1 \big) } > 0, $$ there exists a real number $\delta_2 > 0$, depending on $\alpha$ (and $A$), such that $$ 0 < g(x) = \big\lvert g(x) \big\rvert = \big\lvert g(x) - 0 \big\rvert < \frac{A}{2 \big( \lvert \alpha \rvert + 1 \big) } $$ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta_2$. Therefore we must also have $$ \frac{1}{g(x)} > \frac{ 2 \big( \lvert \alpha \rvert + 1 \big) }{A} \tag{2} $$ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta_2$.

Let us put $\delta \colon= \min \left\{ \delta_1, \delta_2 \right\}$. Then $\delta > 0$ since both $\delta_1 > 0$ and $\delta_2 > 0$.

Then from (1) and (2) above, we can conclude that for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta$, we must have $$ \frac{ f(x) }{ g(x) } = f(x) \frac{1}{g(x)} > \frac{A}{2} \frac{ 2 \big( \lvert \alpha \rvert + 1 \big) }{A} = \lvert \alpha \rvert + 1 > \lvert \alpha \rvert \geq \alpha. $$

Thus we have shown that, corresponding to any given real number $\alpha$, there exists a real number $\delta > 0$ and depending on $\alpha$ such that $$ \frac{ f(x) }{ g(x) } > \alpha $$ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta$.

Therefore by virtue of Definition 4.3.5 (i) in Bartle & Sherbert, 4th edition, we conclude that $$ \lim_{x \to c} \frac{ f(x) }{ g(x) } = +\infty. $$
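As a sanity check (not part of the proof), here is a small Python sketch of Case 1 on the hypothetical instance $f(x) = 2 + x$, $g(x) = x^2$, $c = 0$ (so $A = 2 > 0$, $B = 0$), tracing the $\delta_1, \delta_2$ from the argument above:

```python
import math

# Hypothetical concrete instance of Case 1 (illustrative, not from the book):
# f(x) = 2 + x  ->  A = lim_{x->0} f = 2 > 0
# g(x) = x**2   ->  g > 0 for x != 0 and g -> 0, so c = 0 and B = 0.
f = lambda x: 2 + x
g = lambda x: x ** 2
A = 2.0

def delta_for(alpha):
    """Follow the proof: delta1 makes |f - A| < A/2; delta2 makes g < A/(2(|alpha|+1))."""
    delta1 = 1.0                                    # |x| < 1  =>  |f(x) - 2| = |x| < 1 = A/2
    delta2 = math.sqrt(A / (2 * (abs(alpha) + 1)))  # x**2 < A/(2(|alpha|+1))
    return min(delta1, delta2)

# For each threshold alpha, every tested x with 0 < |x - c| < delta gives f(x)/g(x) > alpha.
for alpha in [10.0, 1e3, 1e6]:
    d = delta_for(alpha)
    for x in [d / 2, -d / 2, d / 10]:
        assert f(x) / g(x) > alpha
```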

Case 2. Now suppose that $A < 0$.

Let us choose an arbitrary real number $\beta$. Since $A < 0$, we have $-A > 0$, and so there exists a real number $\delta_1 > 0$, depending on $A$, such that $$ \big\lvert f(x) - A \big\rvert < \frac{-A}{2} $$ and hence $$ f(x) < \frac{A}{2} < 0 \tag{3} $$ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta_1$.

Also, since $\lim_{x \to c} g(x) = 0$ and $\frac{ A }{ -2 ( \lvert \beta \rvert + 1 ) } > 0$, there exists a real number $\delta_2 > 0$, depending on $\beta$ (and also on $A$), such that $$ 0 < g(x) = \big\lvert g(x) \big\rvert = \big\lvert g(x) - 0 \big\rvert < \frac{ A }{ -2 \big( \lvert \beta \rvert + 1 \big) } $$ and so $$ \frac{1}{g(x) } > \frac{-2 \big( \lvert \beta \rvert + 1 \big) }{ A } > 0 \tag{4} $$ for all $x \in [a, b]$ satisfying $0 < \lvert x-c \rvert < \delta_2$.

Then, for all $x \in [a, b]$ for which $0 < \lvert x-c \rvert < \min\left\{ \delta_1, \delta_2 \right\}$, we also have $$ \begin{align} \frac{ f(x) }{ g(x) } &= f(x) \frac{1}{g(x)} \\ &< f(x) \left( \frac{-2 \big( \lvert \beta \rvert + 1 \big) }{ A } \right) \qquad [ \mbox{because of (4) and the fact from (3) that $f(x) < 0$} ]\\ &< \frac{A}{2} \left( \frac{-2 \big( \lvert \beta \rvert + 1 \big) }{ A } \right) \qquad [ \mbox{ Refer to (3) and (4) above again. } ] \\ &= - \lvert \beta \rvert - 1 \\ &< -\lvert \beta \rvert \\ &\leq \beta. \end{align} $$

Since $\beta$ was an arbitrarily chosen real number, it follows from Definition 4.3.5 (ii) that $$ \lim_{x \to c} \frac{f(x)}{g(x)} = -\infty. $$

Is this proof correct and rigorous enough for Bartle & Sherbert? If so, is there really any need for the assumptions that $f$ and $g$ are continuous on $[a, b]$ or differentiable on $(a, b)$? If not, what is missing from my proof?

There are 2 best solutions below.

I agree with you that neither continuity nor differentiability is needed for these results.

I haven't read through your proof carefully, but it sure looks correct to me, and certainly rigorous enough for B&S (especially given that they're throwing in unnecessary assumptions!).

I did notice one thing: if you have proved it for $A>0,$ then you can obtain a quick proof for $A<0$: Suppose we have the result for $A>0.$ If now $A<0,$ then $\lim_{x\to c} -f(x) = -A>0.$ Hence $\lim_{x\to c} -f(x)/g(x) = \infty.$ Therefore $\lim_{x\to c} -[-f(x)/g(x)] =-\infty.$ Since $-[-f(x)/g(x)]=f(x)/g(x),$ we're done.


Yes, your proof is technically correct. However, in order to make your proofs readable, you should consider making them more concise. Here are some minor improvements.

Suppose that $f$ and $g$ are real-valued functions defined on the closed interval $[a, b]$ and that $c \in [a, b]$ is such that

(i) $g(x) > 0$ for $x \in [a, b]$ such that $x \neq c$,

(ii) $\lim_{x \to c} f(x)$ exists in $\mathbb{R}$, and

(iii) $\lim_{x \to c} g(x) = 0$.

Let us put $A \colon= \lim_{x \to c} f(x)$ and $B \colon= \lim_{x \to c} g(x)$. Then $B = 0$ of course.

This is your weakened version of the assumptions in the problem statement, and perhaps you should make it clearer that this is the case. Otherwise, so far so good.

We study the following two cases according as $A> 0$ or $A < 0$.

Case 1. First suppose that $A > 0$.

Let $\alpha \in \mathbb{R}$ be arbitrary.

It suffices to take $\alpha>0$, and this would get rid of the absolute values later. I would also include why we care about the $\alpha$, something like "we want to find a $\delta$ such that $|x-c|<\delta$ implies that $f(x)/g(x) > \alpha$".

An edited version of the remainder of your case 1 proof:

Then as $\lim_{x \to c} f(x) = A$, there exists a real number $\delta_1 > 0$ such that $\big\lvert f(x) - A \big\rvert < \frac{A}{2}$ for all $x \in [a, b]$ such that $0 < \big\lvert x-c \big\rvert < \delta_1$. For all such $x$, we have $$ f(x) > \frac{A}{2} > 0. \tag{1} $$

Moreover, since $\lim_{x \to c}g(x) = 0$, there exists a real number $\delta_2 > 0$ such that $ 0 < g(x) < \frac{A}{2(\lvert \alpha \rvert + 1)} $ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta_2$. Therefore we must also have $$ \frac{1}{g(x)} > \frac{ 2 \big( \lvert \alpha \rvert + 1 \big) }{A} \tag{2} $$ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta_2$.

Let $\delta \colon= \min \left\{ \delta_1, \delta_2 \right\}$. From (1) and (2) above, we can conclude that for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta$, we must have $$ \frac{ f(x) }{ g(x) } = f(x) \frac{1}{g(x)} > \frac{A}{2} \frac{ 2 \big( \lvert \alpha \rvert + 1 \big) }{A} = \lvert \alpha \rvert + 1 > \lvert \alpha \rvert \geq \alpha. $$

Thus we have shown that, corresponding to any given real number $\alpha$, there exists a real number $\delta > 0$ and depending on $\alpha$ such that $$ \frac{ f(x) }{ g(x) } > \alpha $$ for all $x \in [a, b]$ such that $0 < \lvert x-c \rvert < \delta$. Therefore we conclude that $$ \lim_{x \to c} \frac{ f(x) }{ g(x) } = +\infty $$ as was desired.

Now, case 2 can be greatly abbreviated: if we consider the function $h(x):=-f(x)$, then we can simply apply your proof for case 1 to deduce that $h(x)/g(x) \to +\infty$, from which the desired conclusion follows.
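The reduction $h := -f$ can be checked numerically as well; this sketch uses the hypothetical instance $f(x) = -2 + x$ (so $A = -2 < 0$) and $g(x) = x^2$ with $c = 0$:

```python
# Hypothetical instance of the reduction h := -f (illustrative, not from the book):
# f(x) = -2 + x  has  A = lim_{x->0} f = -2 < 0;  g(x) = x**2 > 0 with g -> 0.
f = lambda x: -2 + x
g = lambda x: x ** 2
h = lambda x: -f(x)  # h -> -A = 2 > 0, so Case 1 applies to h/g

for x in [0.01, -0.01, 1e-4]:
    assert h(x) / g(x) > 0                 # Case 1: h/g is positive near c
    # IEEE-754 negation and division make this identity exact in floating point:
    assert f(x) / g(x) == -(h(x) / g(x))
    assert f(x) / g(x) < -100              # so f/g is correspondingly large negative
```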