If we define convolution to be:
$$(f_1 * f_2)(t) = \int_{-\infty}^{\infty}f_1(\tau)f_2(t-\tau)\,d\tau$$
For many functions $f$, the iterated self-convolution $(f*f*\cdots*f)(t)$ tends to produce functions increasingly close to a Gaussian, closer as the number of convolutions increases: $$g(t) = Ke^{-\frac{(t-\mu)^2}{\sigma^2}}$$
This is well known in many contexts: for example, sums of independent and identically distributed (IID) random variables in probability, and the behavior of cascaded filters in signals and systems.
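Here is a quick numerical check (a sketch, with the box function on $[0,1]$ as an arbitrary choice, and the grid spacing `dx` chosen for illustration): convolving the box with itself nine times and comparing against the Gaussian whose mean and variance the CLT predicts.

```python
import numpy as np

# Sketch: iterate self-convolution of a box density on [0, 1] and compare
# with the Gaussian predicted by the CLT for a sum of n IID uniforms.
dx = 0.01
f = np.ones(100)           # box density sampled at midpoints 0.005, ..., 0.995
f /= f.sum() * dx          # normalize so it integrates to 1

n = 10
g = f.copy()
for _ in range(n - 1):
    g = np.convolve(g, f) * dx  # discrete approximation of the convolution integral

# CLT prediction: mean n*mu and variance n*sigma^2, with mu = 1/2, sigma^2 = 1/12
xs = n * 0.005 + np.arange(len(g)) * dx
mu, var = n * 0.5, n / 12.0
gauss = np.exp(-(xs - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

print(np.max(np.abs(g - gauss)))  # sup-norm distance; small after 10 convolutions
```

Each discrete convolution is multiplied by `dx` so that it approximates the integral, which also keeps `g` normalized as a density.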
What is your favorite way to explain / show this?
You may simply rescale $f$ in order to make it a probability density function. If the hypotheses of the central limit theorem are fulfilled, the claim is straightforward. Of course, the usual proof of the CLT deeply exploits the fact that the Fourier transform of the $n$-fold convolution is just the $n$-th power of the Fourier transform of $f$.
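To illustrate that last point numerically (a sketch, with the centered uniform on $[-1/2, 1/2]$ as an arbitrary choice of density): its Fourier transform is $\varphi(t) = \sin(t/2)/(t/2)$, and the $n$-fold convolution, standardized by $\sigma\sqrt{n}$, has transform $\varphi(t/(\sigma\sqrt{n}))^n$, which converges pointwise to $e^{-t^2/2}$.

```python
import numpy as np

# Fourier transform (characteristic function) of the centered uniform density
# on [-1/2, 1/2]: phi(t) = sin(t/2)/(t/2), with variance sigma^2 = 1/12.
def phi(t):
    return np.sinc(t / (2 * np.pi))  # np.sinc(x) = sin(pi x)/(pi x)

sigma = np.sqrt(1.0 / 12.0)
t = np.linspace(-5, 5, 201)

# The n-fold convolution has Fourier transform phi**n; after standardizing
# by sigma*sqrt(n), phi(t / (sigma*sqrt(n)))**n approaches exp(-t**2 / 2).
for n in (1, 10, 100):
    cf = phi(t / (sigma * np.sqrt(n))) ** n
    print(n, np.max(np.abs(cf - np.exp(-t ** 2 / 2))))
```

The printed sup-norm distances shrink as $n$ grows, which is exactly the Fourier-side statement of the convergence to a Gaussian.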