For two random variables $X$ and $Y$, consider the quantity $$I'(X,Y)=\mathrm{JS}(p(X,Y)\,\|\,p(X)p(Y)),$$ where $\mathrm{JS}(\cdot\,\|\,\cdot)$ denotes the Jensen–Shannon (JS) divergence.
This is analogous to mutual information, which is the KL divergence between the joint distribution and the product of the marginals; since the JS divergence vanishes only when its two arguments coincide, $I'(X,Y)=0$ if and only if $X$ and $Y$ are independent. Is there a name for this quantity, or any related analysis of it?
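For concreteness, here is a minimal sketch (my own illustration, not from any reference) of how $I'(X,Y)$ can be computed for discrete variables, assuming the joint distribution is given as a 2-D probability table `p_xy`:

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions (in nats)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture distribution
    def kl(a, b):
        mask = a > 0  # convention: 0 * log(0 / b) = 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jsmi(p_xy):
    """I'(X,Y) = JS(p(X,Y) || p(X)p(Y)) for a joint probability table p_xy."""
    p_x = p_xy.sum(axis=1)       # marginal of X
    p_y = p_xy.sum(axis=0)       # marginal of Y
    p_prod = np.outer(p_x, p_y)  # product of the marginals
    return js_divergence(p_xy.ravel(), p_prod.ravel())

# Independent X, Y: the joint equals the product of marginals, so I' = 0.
print(jsmi(np.array([[0.25, 0.25], [0.25, 0.25]])))  # 0.0
# Perfectly correlated binary X, Y: I' > 0 (about 0.216 nats here).
print(jsmi(np.array([[0.5, 0.0], [0.0, 0.5]])))
```

Note that, unlike KL-based mutual information, this quantity is bounded above by $\log 2$ nats, since the JS divergence itself is.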
This quantity is defined in a 2019 paper* as the Jensen–Shannon mutual information (JSMI). The authors present it as their own proposal, so, at least to their knowledge, it had not been used before.
$\ast$ Goldberger, Jacob, and Yaniv Opochinsky. "Information-Bottleneck Based on the Jensen-Shannon Divergence with Applications to Pairwise Clustering." IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019.