Given a probability distribution, is there an easy way to determine whether it can be expressed as the distribution of the difference of two iid random variables? In other words (working with CDFs here), is there a simpler way to characterize the set of non-decreasing, right-continuous functions $F$ with $F(-\infty)=0$ and $F(\infty)=1$ such that $P(x-y\le z)=F(z)$ holds for all $z$, where $x,y$ are drawn independently from the same distribution with CDF $G(t)$? For example, the probability distribution with PDF $(1-|z|)\mathbf 1_{z\in[-1,1]}$ and CDF $-\frac{z|z|}{2}+z+\frac{1}{2}$ for $z\in[-1,1]$ is in this set, since it is the distribution of the difference of two standard uniform random variables. All normal distributions with mean $0$ are also in this set, since each is the difference of two iid normal random variables.
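As a quick numerical sanity check of the triangular example (a sketch, not part of the question itself), one can compare the empirical CDF of $x-y$ for iid standard uniforms against the formula $-\frac{z|z|}{2}+z+\frac{1}{2}$:

```python
import numpy as np

# Sketch: empirically check that x - y, with x, y iid Uniform(0, 1),
# has CDF F(z) = -z|z|/2 + z + 1/2 on [-1, 1].
rng = np.random.default_rng(0)
n = 1_000_000
d = rng.random(n) - rng.random(n)  # samples of x - y

def F(z):
    """Triangular CDF on [-1, 1] from the question."""
    return -z * abs(z) / 2 + z + 0.5

for z in (-0.75, -0.25, 0.0, 0.5, 0.9):
    empirical = np.mean(d <= z)
    assert abs(empirical - F(z)) < 5e-3, (z, empirical, F(z))
```

With $10^6$ samples the Monte Carlo error is well below the $5\times10^{-3}$ tolerance used here.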
I suspect it might be exactly the set of symmetric probability distributions whose density attains its maximum at $0$, but I'm not sure whether additional requirements are needed (or even whether the maximum-at-$0$ requirement is needed).
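One numerical probe of this conjecture (a sketch of my own, using the $\psi(t)\ge0$ condition from the first edit below): the uniform distribution on $[-1,1]$ is symmetric with its (flat) density maximal at $0$, yet its characteristic function $\sin(t)/t$ takes negative values, so it cannot be a difference of two iid variables:

```python
import numpy as np

# Sketch: Uniform(-1, 1) is symmetric with its flat density maximal at 0,
# but its characteristic function sin(t)/t dips below zero, violating the
# psi(t) >= 0 necessary condition derived via characteristic functions.
def psi_uniform(t):
    """Characteristic function of Uniform(-1, 1): sin(t)/t."""
    return np.sinc(t / np.pi)  # np.sinc(x) = sin(pi*x)/(pi*x)

assert abs(psi_uniform(0.0) - 1.0) < 1e-12
assert psi_uniform(4.0) < 0  # sin(4)/4 is negative
```

So symmetry plus a maximum at $0$ cannot be sufficient on its own, at least if the maximum need not be strict.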
Edit: working with characteristic functions, let $\varphi(t)$ be the characteristic function of the distribution with CDF $G$, and $\psi(t)$ that of the distribution with CDF $F$. Then it must be true that $\psi(t) = \varphi(t)\cdot \varphi(-t)=\varphi(t)\overline{\varphi(t)}=|\varphi(t)|^2$ for some complex-valued $\varphi(t)$ that must itself be a characteristic function. This also means that $\psi(t)\in\mathbb{R}$, that $\psi(t)$ is even, and that $\psi(t)\ge0$. The first two conditions just say that the distribution is symmetric about $0$ (which follows easily from the formulation of the problem), but I don't know what the last condition says about $F$.
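The identity $\psi=|\varphi|^2$ can be checked concretely on the uniform example (a sketch; the closed forms $\varphi(t)=(e^{it}-1)/(it)$ for Uniform$(0,1)$ and $\psi(t)=2(1-\cos t)/t^2$ for the triangular distribution are standard):

```python
import numpy as np

# Sketch: for x, y iid Uniform(0, 1), phi(t) = (e^{it} - 1)/(it), and
# |phi(t)|^2 = 2(1 - cos t)/t^2, the characteristic function of the
# triangular distribution on [-1, 1] -- real, even, and nonnegative.
def phi(t):
    """Characteristic function of Uniform(0, 1)."""
    return (np.exp(1j * t) - 1) / (1j * t)

def psi(t):
    """Characteristic function of the difference of two iid Uniform(0, 1)."""
    return 2 * (1 - np.cos(t)) / t**2

for t in (0.5, 1.0, 3.0, -3.0, 10.0):
    assert abs(abs(phi(t)) ** 2 - psi(t)) < 1e-12   # psi = |phi|^2
    assert psi(t) >= 0                              # nonnegative
    assert abs(psi(t) - psi(-t)) < 1e-12            # even
```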
Edit 2: Altogether, $\psi:\mathbb{R}\to[0,1]$ must be continuous and even, satisfy $\psi(0)=1$, be a positive definite function (are all functions satisfying the previous requirements positive definite? Edit: the answer is no; a counterexample is any such function $f$ with $f(0.1)=0.8$, $f(0.2)=0.9$, $f(0.3)=0.3$), and admit a continuous positive definite function $\varphi:\mathbb{R}\to\mathbb{C}$ with $\varphi(0)=1$ such that $|\varphi(t)|=\sqrt{\psi(t)}$. Most of these are just the requirements (by Bochner's theorem) for being the characteristic function of some probability distribution, although the last one is specific to this problem.
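The counterexample can be verified numerically (a sketch): positive definiteness requires the matrix $[f(t_i-t_j)]_{i,j}$ to be positive semidefinite for every finite set of points $t_i$, and for an even $f$ with $f(0)=1$ and the values above, the points $t=(0,\,0.1,\,0.2,\,0.3)$ already produce a matrix with a negative eigenvalue:

```python
import numpy as np

# Sketch: no even f with f(0) = 1, f(0.1) = 0.8, f(0.2) = 0.9, f(0.3) = 0.3
# can be positive definite.  Positive definiteness requires [f(t_i - t_j)]
# to be positive semidefinite for every choice of points t_i; the four
# points t = (0, 0.1, 0.2, 0.3) already fail.
vals = {0.0: 1.0, 0.1: 0.8, 0.2: 0.9, 0.3: 0.3}
t = [0.0, 0.1, 0.2, 0.3]
M = np.array([[vals[round(abs(a - b), 1)] for b in t] for a in t])

eigmin = np.linalg.eigvalsh(M).min()
assert eigmin < 0  # M is not positive semidefinite
```

(Indeed $\det M \approx -0.07 < 0$, so this symmetric $4\times4$ matrix must have a negative eigenvalue.)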