Assume $f$ is Riemann integrable on $[a,b],$ and that $F'=f$ on $[a,b].$ Using the Fundamental Theorem of Integral Calculus, show that $\int_a^b f(x)\,dx=(b-a)f\{a+\theta(b-a)\}$ holds for some $\theta\in(0,1)$.
I am using the "Fundamental theorem of integral calculus" as
Let $f: [a,b] \to \mathbb{R}$ be a Riemann integrable function. If $F: [a,b] \to \mathbb{R}$ is an antiderivative of $f$, then $$\int_a^b \! f(x) \, \mathrm{d}x = F(b)-F(a).$$
How to use this theorem to show the above?
With the hypotheses as stated, namely that $f$ is Riemann integrable on $[a,b]$ and $F'=f$ on $[a,b],$ the conclusion follows.
Proof: Let $I = \int_a^b f.$ Define
$$G(x) = F(x)-\frac{I}{b-a}x,\,\,\,x\in [a,b].$$
Note that $G'(x) = f(x) - \dfrac{I}{b-a}$ on $[a,b].$ Also, by the version of the FTC you stated, $F(b)-F(a) = I,$ so $$G(b) - G(a) = F(b)-F(a) - \frac{I}{b-a}(b-a) = I - I = 0.$$ Since $G$ is differentiable on $[a,b],$ the mean value theorem gives $G'(c) = 0$ for some $c\in (a,b).$ This implies
$$0=G'(c) = f(c)-\frac{I}{b-a} \, \implies\, I = (b-a)f(c).$$
Since $c,$ like any point in $(a,b),$ can be written as $a+\theta(b-a)$ for some $\theta \in (0,1),$ we're done.
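As a quick numerical sanity check of the identity $I = (b-a)\,f(a+\theta(b-a))$, here is a short sketch using $f(x)=x^2$ on $[0,1]$ (my own choice of example, not from the question). Since this $f$ is increasing, the point $c$ can be located by bisection; the exact answer here is $c = 1/\sqrt{3}$.

```python
# Sanity check of I = (b-a) * f(a + theta*(b-a)).
# Example integrand f(x) = x^2 on [0, 1] (chosen for illustration).

def f(x):
    return x * x

a, b = 0.0, 1.0

# Midpoint-rule approximation of I = integral of f over [a, b].
n = 100_000
h = (b - a) / n
I = sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# The proof guarantees some c in (a, b) with f(c) = I / (b - a).
# This f is increasing on [0, 1], so bisection finds that c.
target = I / (b - a)
lo, hi = a, b
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) < target:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2
theta = (c - a) / (b - a)

print(theta)  # about 1/sqrt(3) = 0.5773... for this example
print(abs(I - (b - a) * f(a + theta * (b - a))))  # essentially 0
```

For this example $\theta = c \approx 0.5774$, confirming that a valid $\theta$ lies strictly inside $(0,1)$.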