Knowing the definite integral implies knowing the indefinite integral


Spivak states the following (from pages 370 to 371 in the fourth edition of Calculus): "If we can find $\int_{a}^{b}f(x)\,dx$ for all $a$ and $b$, then we can certainly find $\int f(x)\,dx$." In other words, knowing the definite integral for all possible $a$ and $b$ implies that we have an antiderivative of $f$, that is, a function $F$ such that $F'=f$. But how do you prove this? (I'm not sure how to even show that such an $F$ is differentiable.)

The example he provides is the following: since, for all $a$ and $b$, $$\int_{a}^{b}\sin^{5}x \cos x \, dx = \frac{\sin^{6} b}{6} - \frac{\sin^{6} a}{6},$$ it follows that $$\int \sin^{5}x \cos x \, dx = \frac{\sin^{6} x}{6}.$$ The example makes sense, but why does it apply to $f$ in general?
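To spell out why the example itself works: differentiating the claimed antiderivative with the chain rule gives back the integrand, $$\frac{d}{dx}\left(\frac{\sin^{6} x}{6}\right) = \frac{6\sin^{5}x\cos x}{6} = \sin^{5}x \cos x.$$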

I always thought you went from using indefinite integrals to evaluate definite integrals, so I'm confused as to why we can go backwards.


1 Answer

Best Answer

Remember that one of the basic steps in proving the Fundamental Theorem of Calculus is showing that if $f$ is a continuous function, then the function $G$ defined by $G(t)=\int_a^t f(x)\,dx$ is an antiderivative of $f$.

How is that proved? You form the difference quotient for $G'(t)$: $$ G'(t)=\lim_{h\to0}\frac{G(t+h)-G(t)}{h}=\lim_{h\to0}\frac{\int_a^{t+h}f(x)\,dx-\int_a^t f(x)\,dx}{h} =\lim_{h\to0}\frac{\int_t^{t+h}f(x)\,dx}{h}. $$ How do you interpret that last quotient? Thinking of the integral as measuring area, you are dividing the area of a thin strip, nearly $f(t)$ high and $h$ wide, by $h$; by continuity of $f$, the quotient tends to $f(t)$ as $h\to0$, so $G'(t)=f(t)$.
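To tie this back to the question: if $\int_a^b f(x)\,dx$ can be found for every $a$ and $b$, then for any fixed $a$ the function $G(t)=\int_a^t f(x)\,dx$ is known explicitly, and (for continuous $f$) the argument above shows $G'=f$, so $G$ is an indefinite integral of $f$; different choices of $a$ only change $G$ by a constant, which is the usual $+C$. In Spivak's example, taking $a=0$ gives $$G(t)=\int_0^t \sin^{5}x\cos x\,dx=\frac{\sin^{6}t}{6}-\frac{\sin^{6}0}{6}=\frac{\sin^{6}t}{6},\qquad G'(t)=\sin^{5}t\cos t.$$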

And there you are.