I was studying a particular machine learning algorithm (the GeoGAN) and, although this isn't mentioned in the paper, it seems that under certain conditions the algorithm is maximizing (over $p$) the following function of probability density functions $p$ and $q$: $$D(p, q) = \int_{-\infty}^{\infty} \min(p(x), q(x))\,dx$$
This appears to measure some sort of (inverse) divergence between $p$ and $q$: when $p=q$, it attains its maximum possible value of 1, and when $p$ and $q$ have disjoint supports, it attains its minimum possible value of 0.
Is there a name for this metric, or is it similar to any known divergences? I couldn't find anything that seemed related on Wikipedia. Has anybody studied it or could point to any places I could learn more about it?
This measure is related to the histogram intersection similarity. Using the identity $\min(a,b)=\tfrac{1}{2}(a+b-|a-b|)$ and the fact that $p$ and $q$ each integrate to 1, it can be rewritten as $$ D(p,q)=1-\frac{1}{2}\int_{-\infty}^{\infty}|p(x)-q(x)|\,dx. $$ So $D(p,q)$ is a similarity measure, and maximizing it is equivalent to minimizing the $L^1$ distance between $p$ and $q$. (When $p$ and $q$ have disjoint supports, $\int|p-q|\,dx=2$, so $D(p,q)=0$, consistent with the minimum value you observed.)
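As a quick sanity check of the identity, here is a minimal numerical sketch on two discretized Gaussians (the grid and the Gaussian parameters are arbitrary choices for illustration, not from the original post):

```python
import numpy as np

# Numerically compare  ∫ min(p, q) dx  with  1 - (1/2) ∫ |p - q| dx
# on two discretized Gaussian densities.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

p = gaussian(x, mu=0.0, sigma=1.0)
q = gaussian(x, mu=1.5, sigma=1.0)

d_min = np.sum(np.minimum(p, q)) * dx            # ∫ min(p, q) dx
d_l1  = 1.0 - 0.5 * np.sum(np.abs(p - q)) * dx   # 1 - (1/2) ∫ |p - q| dx

print(d_min, d_l1)  # the two values agree to numerical precision
```

The two quantities coincide (up to discretization error), and both lie strictly between 0 (disjoint supports) and 1 ($p=q$).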