In both signal processing and probability, one ends up with integrals without complex exponentials that look very much like the transform integrals that contain them, yet the complex versions converge (and so are defined) more often, even though the parallels between the two are so strong.
For example, based on some probability theory I am doing, it looks to me like the integral
$$\int_{-\infty}^{+\infty}\frac{1}{1+\vert t\vert}dt$$
fails to converge (it evaluates in the limit to $\infty$), but
$$\int_{-\infty}^{+\infty}e^{i\omega t}\frac{1}{1+\vert t\vert}dt$$
will converge to a number. Now, since $e^{i\omega t} = \cos \omega t + i\sin\omega t$, for any given value of $\omega$ this multiplies the peaked, slowly decaying function $\frac{1}{1+\vert t\vert}$ by oscillating sine and cosine factors, which create negative as well as positive regions that cancel, so that the integral has a finite value for all $\omega > 0$, I think.
This seems similar to the difference between (regular) convergence and absolute convergence in series.
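As a quick sanity check of that analogy (a small Python snippet of my own, not part of the original argument): the harmonic series diverges, while the alternating harmonic series converges to $\ln 2$, exactly the regular-versus-absolute distinction in miniature.

```python
import math

# Partial sums of the harmonic series grow without bound (like ln N)...
harmonic = sum(1.0 / n for n in range(1, 100_001))

# ...while the alternating version converges to ln 2, with error bounded
# by the first neglected term (the alternating series test).
alternating = sum((-1) ** (n + 1) / n for n in range(1, 100_001))

assert harmonic > 12.0                          # ~ ln(100000) + gamma ~ 12.09
assert abs(alternating - math.log(2)) < 1e-4    # well within the 1/N bound
```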
Is my intuition right, and if so, where in mathematics is the explanation for why this works out?
A simple reason your transform integral converges is that the function is monotone decreasing as $t\rightarrow\pm\infty$ and tends to $0$. So you can rewrite either one-sided integral, on $(-\infty,0]$ or $[0,\infty)$, involving either $\sin$ or $\cos$, as an alternating series whose terms are the contributions from successive half-periods of the oscillation, and whose general term tends to $0$. The alternating series test then gives convergence, and it even gives the first neglected term as a bound on the error when truncating the integral to a finite interval.

Consequently the convergence is uniform in $\omega$ on any closed interval bounded away from $\omega = 0$, which also gives continuity of the resulting transform for $\omega \neq 0$. This guarantees that you can form the transform of all kinds of functions that are not integrable.
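To see the alternating-series structure concretely, here is a small numerical sketch (plain Python with a homemade trapezoidal rule; the choices $\omega = 2$ and the starting point at the first zero of the cosine are my own, purely for illustration):

```python
import math

def f(t):
    # the question's integrand envelope, monotone decreasing on [0, inf)
    return 1.0 / (1.0 + t)

def integrate(g, a, b, n=2000):
    # composite trapezoidal rule (standard library only)
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b))
    for i in range(1, n):
        s += g(a + i * h)
    return s * h

omega = 2.0
half = math.pi / omega   # half-period of cos(omega * t)
t0 = half / 2            # first zero of cos(omega * t) on [0, inf)

# Contribution of cos(omega t) * f(t) from each successive half-period.
terms = [integrate(lambda t: math.cos(omega * t) * f(t),
                   t0 + k * half, t0 + (k + 1) * half)
         for k in range(40)]

# The signs alternate and the magnitudes decrease toward 0,
# so the alternating series test applies to the tail of the integral.
assert all(terms[k] * terms[k + 1] < 0 for k in range(len(terms) - 1))
assert all(abs(terms[k]) > abs(terms[k + 1]) for k in range(len(terms) - 1))
```

The same computation with $\sin$ in place of $\cos$, or on $(-\infty, 0]$, behaves identically, since only the monotone decay of the envelope matters.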