I am interested in relating the asymptotic behavior of a function $f(t)$ for large values of $t$ to the asymptotic behavior of its Laplace transform $\hat{f}(s)$ for small values of $s$. In practice $f(t)$ will often be a probability density function.
Classic results cover related regimes: Watson's lemma connects the behavior of $f(t)$ as $t\to 0^+$ with that of $\hat{f}(s)$ as $s\to\infty$, while Abelian/Tauberian theorems connect the behavior of $f(t)$ as $t\to\infty$ with that of $\hat{f}(s)$ as $s\to 0^+$. My question is: assuming $\hat{f}(s)$ is defined for all $s\in \mathbb{R}$, how can I relate $f(t)$ for $t\to\infty$ and $\hat{f}(s)$ for $s\to -\infty$? What conditions on $f(t)$ guarantee that $\hat{f}(s)$ has a prescribed asymptotic behavior as $s\to -\infty$? Conversely, how does the behavior of $f(t)$ at plus infinity constrain the behavior of $\hat{f}(s)$ at minus infinity?
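To make the question concrete, here is a simple worked example of my own (assuming $f$ is supported on $[0,\infty)$ and has a Gaussian tail, so that $\hat{f}(s)$ indeed exists for every real $s$): taking $f(t)=e^{-t^2}$ and completing the square gives

$$\hat{f}(s)=\int_0^\infty e^{-st}e^{-t^2}\,dt=\frac{\sqrt{\pi}}{2}\,e^{s^2/4}\operatorname{erfc}\!\left(\frac{s}{2}\right)\sim \sqrt{\pi}\,e^{s^2/4}\qquad(s\to-\infty),$$

so the superexponential decay of $f(t)$ at $+\infty$ reappears as superexponential growth of $\hat{f}(s)$ at $-\infty$. I would like to understand how general this kind of correspondence is.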