I am trying to compute the normalized information distance (an entropy-based measure of the statistical dependence between two random variables).
Normalized Information Distance formula:

$$ NID(X, Y) = \frac{2\,H(X, Y) - H(X) - H(Y)}{H(X, Y)} $$
H(X), H(Y), and H(X, Y) denote the entropy of X, the entropy of Y, and the joint entropy of {X, Y}, respectively. It is a true metric and hence satisfies non-negativity; its value should lie in [0, 1].
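For concreteness, here is a minimal sketch of the kind of computation I mean (assuming discrete samples and plug-in entropy estimates, with all three entropies taken from the same joint histogram; the function names are mine):

```python
import numpy as np

def nid(x, y):
    """Normalized information distance between two discrete samples,
    with every entropy estimated from the same joint histogram."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)          # joint counts
    p_xy = joint / joint.sum()

    def H(p):
        p = p[p > 0]                       # convention: 0 * log 0 = 0
        return -np.sum(p * np.log2(p))

    h_xy = H(p_xy.ravel())
    h_x = H(p_xy.sum(axis=1))              # marginal of X from the joint
    h_y = H(p_xy.sum(axis=0))              # marginal of Y from the joint
    if h_xy == 0:                          # both variables constant
        return 0.0
    return (2 * h_xy - h_x - h_y) / h_xy

rng = np.random.default_rng(0)
x = rng.integers(0, 4, 1000)
y = rng.integers(0, 4, 1000)
print(nid(x, y))   # close to 1 for independent samples
print(nid(x, x))   # 0 (up to float error) for identical samples
```

When the marginals are read off the same joint table, the estimates automatically satisfy H(X, Y) ≥ max(H(X), H(Y)), which is what keeps the result in [0, 1].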
But when I compute this metric on my data, the value comes out below 0, and I am not sure how to proceed.
- What does it mean to get a negative value for a metric that should only take values between 0 and 1?
- I have checked my computation of NID and it seems correct to me.
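To make the question concrete, here is a self-contained snippet (my own construction, not my actual pipeline) that reproduces a negative value when the three entropies are estimated inconsistently, e.g. with different binnings, so that the estimated H(X, Y) falls below max(H(X), H(Y)):

```python
import numpy as np

def hist_entropy(counts):
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = rng.normal(size=5000)

# Inconsistent estimation: fine bins for each marginal ...
h_x = hist_entropy(np.histogram(x, bins=256)[0])
h_y = hist_entropy(np.histogram(y, bins=256)[0])
# ... but a coarse grid for the joint, so h_xy < max(h_x, h_y)
h_xy = hist_entropy(np.histogram2d(x, y, bins=8)[0].ravel())

nid = (2 * h_xy - h_x - h_y) / h_xy
print(nid)   # negative, even though the true NID is in [0, 1]
```

Could something like this, or a similar estimation inconsistency, explain the negative value, or does it indicate a deeper problem?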