Why is the time-domain derivative equivalent to multiplication by frequency ($s$) in the Laplace transform?
Why is the time-domain integral equivalent to division by frequency ($\frac{1}{s}$) in the Laplace transform?
Intuitively, I thought the reason was that the frequency $s$ is a sort of "rate of change", so multiplying by it is somehow equivalent to applying $\frac{d}{dt}$. The Laplace transform turns the rate of change into a variable ($s$) and holds it constant throughout the problem, which is why algebraic manipulations in the frequency domain are possible.
Am I on the right track?
One way to remember it is to think of the integral in the inversion formula as a sort of sum (completely non-rigorous here):
$$ f(t) = c_1 e^{s_1 t} + c_2 e^{s_2 t} + c_3 e^{s_3 t} + \cdots $$
Here the coefficient $c_1$ is proportional to the Laplace transform at $s_1$, and so on. Then since the derivative of $e^{st}$ with respect to $t$ is $se^{st}$, you get that the coefficients in the sum are multiplied by $s$ when you take the derivative:
$$ f'(t) = c_1 s_1 e^{s_1 t} + c_2 s_2 e^{s_2 t} + c_3 s_3 e^{s_3 t} + \cdots $$
Conversely, integrating $f$ multiplies the coefficients by $1/s$. Again, this is not in any way a proof, but hopefully it gives some intuition.
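The rigorous statements behind this intuition are $\mathcal{L}\{f'\}(s) = sF(s) - f(0)$ and $\mathcal{L}\{\int_0^t f\}(s) = F(s)/s$. A quick sanity check with sympy (assuming it is installed), using $f(t) = \sin t$ as a sample function:

```python
import sympy as sp

t, s = sp.symbols('t s', positive=True)
tau = sp.symbols('tau', positive=True)

f = sp.sin(t)                                       # sample function with f(0) = 0
F = sp.laplace_transform(f, t, s, noconds=True)     # 1/(s**2 + 1)

# Derivative rule: L{f'}(s) = s*F(s) - f(0)
Fd = sp.laplace_transform(sp.diff(f, t), t, s, noconds=True)
assert sp.simplify(Fd - (s*F - f.subs(t, 0))) == 0

# Integral rule: L{∫_0^t f}(s) = F(s)/s
g = sp.integrate(f.subs(t, tau), (tau, 0, t))       # 1 - cos(t)
Fi = sp.laplace_transform(g, t, s, noconds=True)
assert sp.simplify(Fi - F/s) == 0
```

Note the $-f(0)$ term in the derivative rule: the heuristic sum above hides it because initial conditions are absorbed into the coefficients.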
Another way to think about it is that the smoothness of a function is related to how fast its transform decays as $s \to \infty$. Differentiating makes a function less smooth (a twice differentiable function becomes once differentiable, etc.), while integrating makes a function more smooth (a bounded discontinuous function becomes continuous, etc.). So it makes sense that differentiation would make the transform decay slower, in the sense that $1/s$ decays slower than $1/s^2$.
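A concrete instance of this: starting from the unit step $u(t)$, each integration produces a smoother function and multiplies the transform by $1/s$:

$$ \mathcal{L}\{u(t)\} = \frac{1}{s}, \qquad \mathcal{L}\{t\} = \frac{1}{s^2}, \qquad \mathcal{L}\!\left\{\frac{t^2}{2}\right\} = \frac{1}{s^3}. $$

The step is discontinuous and its transform decays like $1/s$; one integration gives the continuous ramp $t$, whose transform decays faster; another gives the differentiable $t^2/2$, faster still.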