Let $S_{\text{KL}}(\mu,\nu)=D_{\text{KL}}(\mu\,\|\,\nu)+D_{\text{KL}}(\nu\,\|\,\mu)$ denote the symmetric KL divergence between discrete measures $\mu,\nu$. Assume that for some $\nu'\neq \nu,\mu$ and $\mu'\neq\mu,\nu$ it holds that $$S_{\text{KL}}(\mu,\nu)=S_{\text{KL}}(\mu',\nu').$$ Define $$\mu K\triangleq \mu \circledast K,$$ where $\circledast$ is the convolution operator and $K$ is an arbitrary measure with discrete support. Under which condition on $K$ is the following implication true: $$S_{\text{KL}}(\mu,\nu)=S_{\text{KL}}(\mu',\nu')\implies S_{\text{KL}}(\mu K,\nu K)=S_{\text{KL}}(\mu'K,\nu'K)?$$
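As a numerical sanity check (a Python/NumPy sketch; the specific pmfs `mu`, `nu`, `K` below are hypothetical, not taken from the question), convolving both measures with a common kernel $K$ is a Markov channel $X\mapsto X+Z$ with $Z\sim K$, so by the data-processing inequality the symmetric KL can only shrink. This is why equality being *preserved* after convolution is a nontrivial condition on $K$:

```python
import numpy as np

def kl(p, q):
    # KL divergence D(p || q) for discrete pmfs on a common support;
    # terms with p[i] = 0 contribute 0 by convention.
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def skl(p, q):
    # symmetric KL: D(p || q) + D(q || p)
    return kl(p, q) + kl(q, p)

# hypothetical pmfs on {0, 1, 2} and a kernel pmf on {0, 1}
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.2, 0.5, 0.3])
K  = np.array([0.6, 0.4])

# convolution of pmfs = distribution of the independent sum X + Z
muK = np.convolve(mu, K)
nuK = np.convolve(nu, K)

print(skl(mu, nu), skl(muK, nuK))  # the second value cannot exceed the first
```

Running this shows $S_{\text{KL}}(\mu K,\nu K)\leq S_{\text{KL}}(\mu,\nu)$, consistent with the data-processing inequality applied to each KL direction.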
Conversely, given the pair $\mu,\nu$, assume I can choose $\mu',\nu'$ such that $$S_{\text{KL}}(\mu',\nu')\leq\epsilon\leq S_{\text{KL}}(\mu,\nu)$$ for some $\epsilon>0$. Let $K$ be an arbitrary measure as before. What is the appropriate choice of $\mu',\nu'$ such that $$S_{\text{KL}}(\mu',\nu')\leq\epsilon\leq S_{\text{KL}}(\mu,\nu)\implies S_{\text{KL}}(\mu'K,\nu'K)\leq S_{\text{KL}}(\mu K,\nu K)?$$
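One natural candidate for such a pair (an assumption on my part, not something the question prescribes) is the mixture pair $\mu'=(1-t)\mu+t\nu$, $\nu'=(1-t)\nu+t\mu$: by joint convexity of KL, $S_{\text{KL}}(\mu',\nu')\leq S_{\text{KL}}(\mu,\nu)$, and it decreases continuously to $0$ as $t\to 1/2$, so some $t$ brings it below any $\epsilon>0$. A sketch (the pmfs and the tolerance `eps` are hypothetical):

```python
import numpy as np

def kl(p, q):
    # KL divergence for discrete pmfs on a common support
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def skl(p, q):
    return kl(p, q) + kl(q, p)

mu  = np.array([0.5, 0.3, 0.2])
nu  = np.array([0.2, 0.5, 0.3])
eps = 0.05  # hypothetical tolerance

# interpolate the pair toward its midpoint until S_KL drops below eps;
# at t = 1/2 both mixtures coincide and S_KL = 0, so the loop must stop
for t in np.linspace(0.0, 0.5, 501):
    mup = (1 - t) * mu + t * nu
    nup = (1 - t) * nu + t * mu
    if skl(mup, nup) <= eps:
        break

print(t, skl(mup, nup))
```

Whether this particular choice also yields $S_{\text{KL}}(\mu'K,\nu'K)\leq S_{\text{KL}}(\mu K,\nu K)$ for *every* kernel $K$ is exactly the open part of the question; the sketch only guarantees the antecedent $S_{\text{KL}}(\mu',\nu')\leq\epsilon$.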