Prove that $ f:(a,b)\to\mathbb{R}$ is integrable iff $\lim_{\epsilon\to0} \int_{[a+\epsilon,b-\epsilon]}f$ exists


I want to solve the following:

Let $f:(a,b)\to\mathbb{R}$ be continuous with $f(x)\ge 0$ for all $x\in(a,b)$. Show that $f$ is integrable iff $\displaystyle \lim_{\varepsilon\to0} \int_{[a+\varepsilon,b-\varepsilon]}f$ exists.

My attempt:

$\Leftarrow]$ I want to invoke the following proposition:

If $A$ is open and bounded, and $f:A\to\mathbb{R}$ is bounded and its set of discontinuities has measure zero, then $f$ is integrable.

And we have that $(a,b)$ is bounded, being contained in the one-dimensional rectangle $[a,b]$, and since $f$ is continuous we have that it is bounded on $(a,b)$; but the thing is that this argument does not use the limit at all. Can you help me fix this, please?

$\Rightarrow]$ If $f$ is integrable then, taking $F$ to be a partition of unity subordinate to an admissible cover of $(a,b)$, we have that:

$$\sum_{\phi \in F} \int_{(a,b)} \phi \cdot f = \int_{(a,b)} f = \lim_{\varepsilon\to0} \int_{[a+\varepsilon,b-\varepsilon]}f$$

But I think this is a little bit trivial and I think I am wrong. Can you help me verify this, and if it is wrong, can you help me fix the mistakes please?

Thanks a lot in advance :)

1 Answer

The key is that $f(x) \geq 0$. If you don't have that, it isn't true:

$\int_{-1}^1 \frac{x}{1-x^2}\, dx$ does not exist (near $x = 1$ the integrand behaves like $\frac{1}{2(1-x)}$, a non-integrable singularity, and similarly near $x = -1$), but $\int_{-1+\epsilon}^{1-\epsilon} \frac{x}{1-x^2}\, dx = 0$ for all $0 < \epsilon < 1$ because the function is odd.
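Concretely, using the antiderivative $-\tfrac{1}{2}\ln(1-x^2)$, the symmetric integral vanishes because $1-x^2$ takes the same value $\epsilon(2-\epsilon)$ at both endpoints:

$$\int_{-1+\epsilon}^{1-\epsilon} \frac{x}{1-x^2}\, dx = \left[-\tfrac{1}{2}\ln(1-x^2)\right]_{-1+\epsilon}^{1-\epsilon} = 0,$$

while a one-sided piece diverges:

$$\int_0^{1-\epsilon} \frac{x}{1-x^2}\, dx = -\tfrac{1}{2}\ln\big(\epsilon(2-\epsilon)\big) \to \infty \quad \text{as } \epsilon \to 0^+.$$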


Note that you can't assume $f$ is bounded, or that it extends continuously to $[a,b]$. The usual definition of the Riemann integral only applies to bounded functions on closed intervals, so in this case we're looking at an integral which is potentially improper at $a$ and at $b$. We do have that $f$ is continuous on $(a,b)$, so it is integrable on any proper closed subinterval of $(a,b)$.
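For instance, a function like

$$f(x) = \frac{1}{(x-a)(b-x)}$$

is continuous and nonnegative on $(a,b)$ yet unbounded near both endpoints, so criteria for bounded functions on $[a,b]$ cannot be applied to it directly.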

One definition of $\int_a^b f(x)\, dx$ for integrals which are improper at $a$ and at $b$ (only) would be, choosing some $c \in (a,b)$:

$$\int_a^b f(x)\, dx = \lim_{t\rightarrow a^+} \int_t^c f(x)\, dx + \lim_{s \rightarrow b^-} \int_c^s f(x)\, dx \qquad(*)$$

The claim is that each of those limits exists provided that $f(x) \geq 0$ and $\lim_{\epsilon\rightarrow 0^+} \int_{a+\epsilon}^{b-\epsilon} f(x)\, dx$ exists. Call that integral $I(\epsilon) = \int_{a+\epsilon}^{b-\epsilon} f(x)\, dx$.
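Note that once $0 < \epsilon < \min(c-a,\, b-c)$, additivity of the integral splits $I(\epsilon)$ across $c$:

$$I(\epsilon) = \int_{a+\epsilon}^{c} f(x)\, dx + \int_{c}^{b-\epsilon} f(x)\, dx.$$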

Because $f(x) \geq 0$, we have that $I(\epsilon)$ is increasing as $\epsilon$ decreases. Therefore the limit $\lim_{\epsilon\rightarrow 0^+} I(\epsilon)$ exists if and only if the set of values $\{I(\epsilon): \epsilon > 0\}$ is bounded, in which case the limit is the supremum of these values.
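Spelled out: if $0 < \epsilon' < \epsilon$, then $[a+\epsilon, b-\epsilon] \subseteq [a+\epsilon', b-\epsilon']$, and since $f \geq 0$,

$$I(\epsilon') - I(\epsilon) = \int_{a+\epsilon'}^{a+\epsilon} f(x)\, dx + \int_{b-\epsilon}^{b-\epsilon'} f(x)\, dx \;\geq\; 0.$$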

It then quickly follows that if this is true, then each of the limits above in equation $(*)$ exists, because each of those integrals is likewise bounded and increases as $t$ decreases and as $s$ increases.
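For the first limit, for example: given $a < t < c$, set $\epsilon = t - a$; once $\epsilon$ is small enough that $c \leq b - \epsilon$, the fact that $f \geq 0$ gives

$$\int_t^c f(x)\, dx \;\leq\; \int_{a+\epsilon}^{b-\epsilon} f(x)\, dx = I(\epsilon) \;\leq\; \sup_{\epsilon > 0} I(\epsilon) < \infty,$$

so $\int_t^c f(x)\, dx$ is increasing and bounded as $t \to a^+$, and the limit exists. The second limit in $(*)$ is handled symmetrically.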


I don't have Spivak on hand, and am not certain what definitions he (and hence you) are using, but you should be able to give a proof roughly equivalent to the above using whatever definitions you have.