As of July 5, 2019, Wikipedia states that "Unlike the mutual information, however, the variation of information is a true metric, in that it obeys the triangle inequality." (https://en.wikipedia.org/wiki/Variation_of_information). Since it is defined as $VI(X;Y) = H(X \mid Y) + H(Y \mid X)$, however, it seems that it would be zero whenever two random variables completely determine each other. I was under the impression that two random variables can determine each other without being the exact same variable (see the sketch below), in which case the variation of information would in fact be a pseudometric.
Am I missing something, or am I just taking the phrase "true metric" too literally, so that the sentence should be read as saying only that the variation of information is more metric-like than mutual information?
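To make the concern concrete, here is a minimal Python sketch (the function names are my own, and it uses the identity $VI(X;Y) = 2H(X,Y) - H(X) - H(Y)$, which follows from the definition above). It takes $X$ uniform on $\{0,1\}$ and $Y = 1 - X$: the two variables determine each other but are not the same variable, yet the variation of information between them is 0.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def variation_of_information(joint):
    """VI(X;Y) = H(X|Y) + H(Y|X) = 2*H(X,Y) - H(X) - H(Y),
    given the joint distribution of (X, Y) as a 2-D array."""
    joint = np.asarray(joint, dtype=float)
    h_xy = entropy(joint.ravel())        # H(X,Y)
    h_x = entropy(joint.sum(axis=1))     # H(X), marginal over rows
    h_y = entropy(joint.sum(axis=0))     # H(Y), marginal over columns
    return 2 * h_xy - h_x - h_y

# X uniform on {0,1} and Y = 1 - X: each determines the other, but X != Y.
joint = np.array([[0.0, 0.5],
                  [0.5, 0.0]])
print(variation_of_information(joint))   # prints 0.0 although X and Y differ
```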
The axiom that only identical points are at distance 0 can easily be dropped from the definition of a metric, with only mild changes to the general theory, and in modern books this axiom is beginning to disappear. The triangle inequality, on the other hand, is much more substantial. The sentence you mention seems to draw attention to the fact that the variation of information satisfies the triangle inequality, and is thus a "true metric" in that sense, unlike mutual information.