Consider a finite alphabet $\mathcal{X}$, two probability distributions $p$ and $q$ on $\mathcal{X}$, and a random variable $\mathrm{X}\sim p$. The relative entropy variance is defined as
$$V(p \| q):=\operatorname{Var}\left(\log \frac{p(\mathrm{X})}{q(\mathrm{X})}\right)=\sum_{x \in \mathcal{X}} p(x) \cdot\left(\log \frac{p(x)}{q(x)}-D(p \| q)\right)^2,$$
where $D(p\|q)$ is the relative entropy (KL divergence), given by
$$D(p\|q) = \sum_{x\in\mathcal{X}}p(x)\log\frac{p(x)}{q(x)}.$$
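To make the two definitions concrete, here is a small numerical sketch (the function name `rel_entropy_and_variance` is my own, and it assumes $\operatorname{supp}(p)\subseteq\operatorname{supp}(q)$ so the log-likelihood ratio is finite):

```python
import numpy as np

def rel_entropy_and_variance(p, q):
    """Return (D(p||q), V(p||q)) for distributions on a finite alphabet.

    Assumes supp(p) is contained in supp(q), so log p(x)/q(x) is finite
    wherever p(x) > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                          # terms with p(x) = 0 contribute 0
    llr = np.log(p[mask] / q[mask])       # log-likelihood ratio log p(x)/q(x)
    D = np.sum(p[mask] * llr)             # relative entropy D(p||q)
    V = np.sum(p[mask] * (llr - D) ** 2)  # relative entropy variance V(p||q)
    return D, V

# Example: a distribution on a 3-letter alphabet against the uniform one
D, V = rel_entropy_and_variance([0.5, 0.3, 0.2], [1/3, 1/3, 1/3])
```

Note that $V(p\|q)$ is the variance of the same random variable whose mean is $D(p\|q)$, so it vanishes exactly when the log-likelihood ratio is constant on the support of $p$, e.g. when $p = q$.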
Now consider two finite alphabets $\mathcal{X}$ and $\mathcal{Y}$ and random variables $\mathrm{X}$ and $\mathrm{Y}$ taking values in them. Let $W_{\mathrm{Y|X}}: \mathcal{X}\rightarrow \mathcal{Y}$ be a channel, i.e. a conditional probability distribution. For a given input distribution $p_{\mathrm{X}}$, we obtain the joint distribution $p_{\mathrm{XY}}(x,y) = p_{\mathrm{X}}(x)\,W_{\mathrm{Y|X}}(y|x)$ and the marginal on the output $p_{\mathrm{Y}}(y) = \sum_{x\in\mathcal{X}} p_{\mathrm{XY}}(x, y)$. Let us define
$$V(p, W) := V(p_{\mathrm{X}}W_{\mathrm{Y}|\mathrm{X}}\|p_{\mathrm{X}}\times p_{\mathrm{Y}}).$$
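The quantity $V(p, W)$ can likewise be sketched numerically (the function name `channel_V` is my own; `W` is a row-stochastic matrix with rows $W_{\mathrm{Y|X}}(\cdot|x)$):

```python
import numpy as np

def channel_V(p_x, W):
    """V(p, W) = V(p_X W_{Y|X} || p_X x p_Y) for a row-stochastic channel W."""
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(W, dtype=float)
    p_xy = p_x[:, None] * W                 # joint p_XY(x,y) = p_X(x) W(y|x)
    p_y = p_xy.sum(axis=0)                  # output marginal p_Y
    ref = p_x[:, None] * p_y[None, :]       # product distribution p_X x p_Y
    mask = p_xy > 0                         # zero-mass terms contribute 0
    llr = np.log(p_xy[mask] / ref[mask])    # information density log p_XY/(p_X p_Y)
    I = np.sum(p_xy[mask] * llr)            # mutual information I(X:Y)
    return np.sum(p_xy[mask] * (llr - I) ** 2)

# Binary symmetric channel with crossover probability 0.1, uniform input
V_bsc = channel_V([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]])
```

Since $p_{\mathrm{Y}}(y) = 0$ forces $p_{\mathrm{XY}}(x,y) = 0$ for all $x$, the joint distribution is always absolutely continuous with respect to $p_{\mathrm{X}}\times p_{\mathrm{Y}}$, so each term in the sum is finite on a finite alphabet.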
Is $V(p, W)$ always finite, and can it be upper bounded in some way? Such a bound would be analogous to the one for the relative entropy itself:
$$D(p, W) := D(p_{\mathrm{X}}W_{\mathrm{Y}|\mathrm{X}}\|p_{\mathrm{X}}\times p_{\mathrm{Y}}) = I(\mathrm{X}:\mathrm{Y}) \leq \log|\mathcal{X}|.$$