The variational distance is defined by $$ V(P,Q)=\sum _{i}|p_{i} -q_{i}|, $$ where $P=(p_{1},\dots,p_{n})$ and $Q=(q_{1},\dots,q_{n})$ are discrete distributions.
It is fairly easy to see that $V$ is a metric, and in particular that it satisfies the triangle inequality. I now introduce a class prior $\alpha$, $0<\alpha<1$, which represents the relative 'size' of the two distributions, $$ V(\alpha ,P,Q)=\sum _{i}|\alpha p_{i} -(1-\alpha )q_{i}|, $$ and would like to prove that this weighted version again satisfies a triangle inequality,
$$ V(\alpha ,P,Q)+V(\beta ,Q,R)\ge V(\gamma ,P,R), $$ where $R$ is another distribution, $\beta$ and $\gamma$ are the respective priors, and $\gamma$ is, naturally, $$ \gamma =\frac{\alpha \beta }{\alpha \beta +(1-\alpha )(1-\beta )}. $$
I have confidence (numerically) that this triangle inequality indeed holds, but would like a formal proof. Any ideas?
But $V(\alpha, P, Q) \ne V(\alpha, Q, P)$ in general (one has $V(\alpha, P, Q) = V(1-\alpha, Q, P)$, so symmetry holds only for $\alpha = 1/2$), therefore it would not be a metric, right? If that is true, what is the point of proving or disproving the triangle inequality?
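A tiny example of the asymmetry (a sketch; the helper name `V` is mine):

```python
def V(a, P, Q):
    # prior-weighted variational distance from the question
    return sum(abs(a * p - (1 - a) * q) for p, q in zip(P, Q))

P, Q = [1.0, 0.0], [0.5, 0.5]
a = 0.7
print(V(a, P, Q))      # ~0.7: |0.7 - 0.15| + |0 - 0.15|
print(V(a, Q, P))      # ~0.4: not symmetric in (P, Q)
print(V(1 - a, Q, P))  # ~0.7: but swapping P and Q together with a -> 1-a recovers it
```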