Can the Jensen Shannon Divergence be larger than the Cosine Distance for a pair of distributions?


I've been looking for an example where the Jensen-Shannon Divergence is larger than the Cosine Distance, but I can't find any. I've been using Excel's Solver to search for candidate distributions, but I wonder whether there is a proof that such a pair is impossible.

JS Divergence:

$${\rm JSD}(P \parallel Q)= \frac{1}{2}D_\text{KL}(P \parallel M)+\frac{1}{2}D_\text{KL}(Q \parallel M)$$

where $D_\text{KL}$ is the KL Divergence: $$D_\text{KL}(P \parallel Q) = \sum_{x\in\mathcal{X}} P(x) \log\left(\frac{P(x)}{Q(x)}\right)$$

and $M$ is the pointwise average of $P$ and $Q$:

$$M=\frac{1}{2}(P+Q)$$

Cosine Distance:

$$\text{Cosine Distance} = 1 - \text{Cosine Similarity}$$

Cosine Similarity:

$$\text{Cosine Similarity} = {\mathbf{P} \cdot \mathbf{Q} \over \|\mathbf{P}\| \|\mathbf{Q}\|} = \frac{ \sum\limits_{i=1}^{n}{P_i Q_i} }{ \sqrt{\sum\limits_{i=1}^{n}{P_i^2}} \sqrt{\sum\limits_{i=1}^{n}{Q_i^2}} }$$
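The definitions above can be sketched in plain Python (a minimal sketch; it uses the natural logarithm, so the JSD comes out in nats, and it assumes $Q$ is nonzero wherever $P$ is):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) in nats; terms with p[i] == 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """JSD(P || Q) = (1/2) KL(P || M) + (1/2) KL(Q || M), M = (P + Q)/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def cosine_distance(p, q):
    """1 - cosine similarity of P and Q viewed as vectors."""
    dot = sum(pi * qi for pi, qi in zip(p, q))
    norm_p = math.sqrt(sum(pi * pi for pi in p))
    norm_q = math.sqrt(sum(qi * qi for qi in q))
    return 1 - dot / (norm_p * norm_q)
```

Note that the JSD value depends on the log base (nats for $\ln$, bits for $\log_2$), so any comparison against the Cosine Distance has to fix that choice first.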

Accepted answer:

Yes, it can. Here is a counter-example: if you take $\mathbf{P} = [0.2, 0.8]^\top$ and $\mathbf{Q} = [0.05, 0.95]^\top$, you get

  • $\mathrm{JSD}(\mathbf{P}\|\mathbf{Q}) \approx 0.0273$ (using the natural logarithm);
  • $\text{Cosine Distance}(\mathbf{P}, \mathbf{Q}) \approx 0.0184$;

so the JSD exceeds the Cosine Distance for this pair.
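The counter-example can be verified directly (a quick sketch; KL terms use the natural log, so the JSD is in nats):

```python
import math

def kl(p, q):
    # KL divergence in nats; terms with p[i] == 0 contribute 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.2, 0.8]
q = [0.05, 0.95]
m = [(a + b) / 2 for a, b in zip(p, q)]  # mixture M = (P + Q) / 2

jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)

dot = sum(a * b for a, b in zip(p, q))
cos_dist = 1 - dot / (math.hypot(*p) * math.hypot(*q))

print(f"JSD = {jsd:.4f}, cosine distance = {cos_dist:.4f}")
# JSD = 0.0273, cosine distance = 0.0184
```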