This question arises from an unproved assumption made in a proof of L'Hôpital's Rule for Indeterminate Types of $\infty/\infty$ from a Real Analysis textbook I am using. The result is intuitively simple to understand, but I am having trouble formulating a rigorous proof based on limit properties of functions and/or sequences.
Statement/Lemma to be Proved: Let $f$ be a continuous function on an interval $(a,b)\!\subset\!\mathbb{R}$. If $\displaystyle{\lim_{x\rightarrow a+}\!f(x)\!=\!\infty}$, then, given any $\alpha\!\in\!\mathbb{R}$, there exists $c\!\in\!(a,b)$ such that $x\!\in\!(a,c)$ implies $\alpha\leq f(c)<f(x)$.
Relationship with Infinite Limit Definition: At first glance, this may appear to simply be the definition of right-hand infinite limits:
- $\displaystyle{\lim_{x\rightarrow a+}\!f(x)\!=\!\infty}$ is defined to mean: given any $\alpha\!\in\!\mathbb{R}$, there exists $\delta\!>\!0$ such that $x\!\in\!A\cap(a,a+\delta)$ implies $\alpha<f(x)$.
However, the main difference is that the result I am interested in forces an association between the "$\delta$" and the "$\alpha$" (namely $\alpha=f(a+\delta)$; i.e., it forces $c\!\equiv\!a+\delta$ to be in the domain of $f$).
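To see why this is genuinely stronger than the definition, here is an illustrative sanity check (my own example, not from the textbook): for an oscillating function such as $f(x)=(2+\sin(1/x))/x$ on $(0,1)$, the point $a+\delta$ supplied by the definition need not itself satisfy the conclusion, since $f$ can dip below $f(a+\delta)$ inside $(a,a+\delta)$; a point where $f$ is smallest on $(a,a+\delta]$ does work. A numerical sketch:

```python
import math

# Hypothetical example (not from the post): f is continuous on (0, 1)
# and blows up at a = 0, but oscillates on the way up.
def f(x):
    return (2 + math.sin(1 / x)) / x

alpha = 10.0
delta = 0.05  # f(x) >= 1/x > 20 > alpha for 0 < x < delta, so (*) holds

# c = a + delta itself need NOT work: f dips below f(delta) inside (0, delta).
# Instead take c where f is smallest on a grid over (0, delta]; the sampled
# values at x < c then exceed f(c), matching the lemma's conclusion.
grid = [delta * k / 1000 for k in range(1, 1001)]
c = min(grid, key=f)  # first (smallest-x) grid minimiser

assert f(c) >= alpha
assert all(f(x) > f(c) for x in grid if x < c)
```

This is only a grid check, of course; one way to make it rigorous is to note that a continuous $f$ with $f(x)\to\infty$ as $x\to a+$ attains a minimum on $(a,a+\delta]$.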
EDIT: The statement to be proved as I originally presented it did not require $f$ to be continuous on its domain $A$; continuity was added to address the comment and counterexample below.
EDIT #2: Again, a helpful user (@DanielFischer) commented that the statement, even after the first edit, needed a further restriction for it to hold: $A$ must also be an interval.
I believe I have a proof for the special case when the domain is an interval. It is interesting to me that the proof of this simple assertion is not so trivial (although if there is a simpler route to this result, I would be interested in finding out).
Theorem Let $f$ be a continuous function on interval $(a,b)\!\subset\!\mathbb{R}$. If $\displaystyle{\lim_{x\to a+}f(x)\!=\!\infty}$, then, given any $\alpha\!\in\!\mathbb{R}$, there exists $c\!\in\!(a,b)$ such that: (1) $\alpha\!\leq\!f(c)$, and (2) $x\!\in\!(a,c)$ implies $f(c)\!<\!f(x)$.
Proof From the definition of the infinite right-hand limit, given any $\alpha\!\in\!\mathbb{R}$, there exists $\delta\!>\!0$ (which we may take small enough that $a+\delta\!<\!b$) such that: $$\qquad\qquad\ 0\!<\!x\!-\!a\!<\!\delta\ \ \mbox{implies}\ \ \alpha\!<\!f(x).\qquad\qquad(*)$$ Since $a\!+\!\delta\!\in\!(a,b)$, exactly one of the following holds: (1) $f(a\!+\!\delta)\!=\!\alpha$, (2) $f(a\!+\!\delta)\!<\!\alpha$, or (3) $f(a\!+\!\delta)\!>\!\alpha$.