Take $f:[0,\infty)\rightarrow \mathbb{R}$, Riemann integrable on every interval $[0,b]$, and such that there exist constants $M$, $a$, and $T$ with $|f(t)|\leq Me^{at}$ for all $t\geq T$. Show that the Laplace transform of $f$ exists; that is, for every $s>a$ the following integral converges: $$F(s):=\int_0^\infty f(t)e^{-st}\,dt. $$
Here are my thoughts.
I think I need to use the integral comparison test somehow to prove that the integral converges.
It's given that there exist $M$, $a$, and $T$ such that $|f(t)|\leq Me^{at}$ for all $t\geq T$. This can be written as $$\frac{|f(t)|}{e^{at}}\leq M\quad\text{for }t\geq T.$$ Now for any $s>a$, $$\frac{f(t)}{e^{st}}\leq \frac{|f(t)|}{e^{at}}\leq M\quad\text{for }t\geq T.$$ From here I think I should show that $\int_0^\infty \frac{|f(t)|}{e^{at}}dt$ converges and then, using the comparison test, conclude that $\int_0^\infty \frac{f(t)}{e^{st}}dt$ converges. How do I show that the integral converges?
You have to be sharper here: it isn't sufficient to bound $f(t)e^{-st}$ by a constant, since $\int_0^\infty M\,dt$ diverges. But you can use $|f(t)|\leq Me^{at}$ to conclude $|f(t)|e^{-st}\leq Me^{-(s-a)t}$ for $t\geq T$.
Next, we use a very common trick: assume first that $f\geq 0$. In that case $$ \int_0^Rf(t)e^{-st}~dt $$ is monotonically increasing in $R$ (the integrand is nonnegative). Hence, it is sufficient to show that it is bounded independently of $R$. But that can be seen here: $$ \int_0^Rf(t)e^{-st}~dt=\int_0^Tf(t)e^{-st}~dt+\int_T^Rf(t)e^{-st}~dt\\\leq\int_0^Tf(t)e^{-st}~dt+\int_T^RMe^{-(s-a)t}~dt\leq\int_0^Tf(t)e^{-st}~dt+\frac{M}{s-a}e^{-(s-a)T}, $$ where the first integral on the right is finite because $f$ is Riemann integrable on $[0,T]$, and the last step evaluates the elementary exponential integral, using $s>a$. Next, we use the positive and negative parts of a function: defining $f^+(t)=\max\{f(t),0\}$ and $f^-(t)=-\min\{f(t),0\}$, we get $f(t)=f^+(t)-f^-(t)$ with $f^+,f^-\geq 0$.
Now you can apply the argument above to $f^+$ and $f^-$ (both are Riemann integrable on every $[0,b]$ and satisfy $0\leq f^\pm(t)\leq |f(t)|\leq Me^{at}$ for $t\geq T$) and use $$ \int_0^Rf(t)e^{-st}~dt=\int_0^Rf^+(t)e^{-st}~dt-\int_0^Rf^-(t)e^{-st}~dt $$ to finish the proof.
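As a quick numerical sanity check (not part of the proof), one can watch the partial integrals $\int_0^R f(t)e^{-st}\,dt$ stabilize for a concrete example. The function $f(t)=e^{2t}\sin t$ and the values $a=2$, $M=1$, $T=0$, $s=3$ below are illustrative choices, not from the question; for this $f$ the known transform value at $s=3$ is $\frac{1}{(3-2)^2+1}=\frac12$.

```python
import math

# Illustrative example: f(t) = e^{2t} sin t has exponential order a = 2,
# since |f(t)| <= e^{2t} (M = 1, T = 0). For s = 3 > a the tail of the
# integrand is dominated by e^{-(s-a)t} = e^{-t}, so truncating at R = 40
# loses at most e^{-40}/(s-a), which is negligible.

def f(t):
    return math.exp(2 * t) * math.sin(t)

def laplace_integral(s, R=40.0, n=100000):
    """Composite Simpson approximation of int_0^R f(t) e^{-st} dt (n even)."""
    h = R / n
    total = f(0.0) + f(R) * math.exp(-s * R)
    for k in range(1, n):
        t = k * h
        total += (4 if k % 2 else 2) * f(t) * math.exp(-s * t)
    return total * h / 3

approx = laplace_integral(3.0)
exact = 1 / ((3 - 2) ** 2 + 1)  # known closed form: 1/2
print(approx, exact)
```

Increasing $R$ further changes the result only within the tail bound $\frac{M}{s-a}e^{-(s-a)R}$, which is exactly the boundedness-in-$R$ that the proof establishes.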