I am trying to find all functions f satisfying $f'(t)=f(t)+\int_a^bf(t)dt$.
This is a problem from the chapter on logarithms and exponential functions in Spivak's *Calculus*. I gave up and read the solution (which I quickly regretted, though it made me realize I had not carefully read one very important theorem* in the text), and it begins with:
We know $f''(t)=f'(t)$.
How do we know this? Also, in general, how would you have approached this problem? Any solution with your thoughts written out would be much appreciated; I am interested less in the solution itself than in the thought process behind it.
*For the curious, the theorem was that:
If $f$ is differentiable and $f'(x)=f(x)$ for all $x$ then there is a number $c$ s.t. $f(x)=ce^x$ for all $x$.
How do we know that $f''(x) = f'(x)$? Differentiate both sides of $$f'(x) = f(x) + \int_a^b f(t)\,dt.$$ Remember: $\int_a^bf(t)\,dt$ is a number (the net signed area between the graph of $y=f(x)$, the $x$-axis, and the lines $x=a$ and $x=b$). So what is its derivative?
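Spelling out that step (my notation, not Spivak's: write $C$ for the value of the integral):

```latex
% Since C = \int_a^b f(t)\,dt is a constant, differentiating both sides of
% f'(x) = f(x) + C gives
f''(x) = \frac{d}{dx}\bigl(f(x) + C\bigr) = f'(x) + 0 = f'(x).
```

Note that the left-hand side makes sense: $f'(x) = f(x) + C$ exhibits $f'$ as a differentiable function, so $f''$ exists.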
Why would you do this? Because that integral is somewhat annoying: if you just had $f'(x) = f(x)$, then you would be able to solve the differential equation simply enough (e.g., with the theorem you have). Since all that stands in our way is an added constant, differentiating should spring to mind: that will get rid of the constant and just "shift" the problem "one derivative down" (to a relation between $f''(x)$ and $f'(x)$).
Once you know that $f''(x) = f'(x)$, let $g(x) = f'(x)$. Then we have $g'(x)=g(x)$, so the theorem applies to $g(x)$ (exactly what we were hoping for). And you go from there.
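In case it helps, here is a sketch of one way the argument might continue from there (my write-up, not necessarily Spivak's):

```latex
% From g'(x) = g(x), the theorem gives g(x) = f'(x) = ce^x for some constant c.
% Integrating, f(x) = ce^x + d for some constant d. Substituting into the
% original equation f'(x) = f(x) + \int_a^b f(t)\,dt:
ce^x = ce^x + d + \int_a^b \bigl(ce^t + d\bigr)\,dt
     = ce^x + d + c\bigl(e^b - e^a\bigr) + d(b - a),
% so the constant terms must cancel:
d\,(1 + b - a) = -c\bigl(e^b - e^a\bigr),
\qquad
d = -\frac{c\bigl(e^b - e^a\bigr)}{1 + b - a}
\quad (\text{assuming } b - a \neq -1).
```

So the solutions form a one-parameter family determined by the choice of $c$.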
Added. As both Didier Piau and Robert Israel point out, it's definitely bad practice to use the same letter as both an actual variable and the variable of integration (sometimes called the "dummy variable"). Though I see from looking at my copy of Spivak that this originated in the text and not with you.