I am reading a paper that relies heavily on a notion of divergence. I tried to get a basic understanding of divergence, but I cannot see how it is connected here. The paper says: $D(\phi, p) = \phi \log_2(\phi/p) + (1-\phi)\log_2\left(\frac{1-\phi}{1-p}\right)$ is called the divergence of $\phi$ from $p$. I am totally confused by this statement. Can anyone explain it to me in simple terms?
EDIT: Here is the link to the paper. The statement is on top of the 10th page just before equation (8).
This is called the Kullback-Leibler divergence, here specialized to two Bernoulli distributions with success probabilities $\phi$ and $p$. It measures how much one probability distribution differs from another (in bits, since the logarithm is base 2). It is unrelated to the divergence of a vector field from vector calculus.
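To make this concrete, here is a small sketch (the function name `binary_kl` is my own, not from the paper) that evaluates the formula from the question directly:

```python
from math import log2

def binary_kl(phi, p):
    """KL divergence (in bits) of Bernoulli(phi) from Bernoulli(p),
    i.e. phi*log2(phi/p) + (1-phi)*log2((1-phi)/(1-p))."""
    # By convention, a term with phi == 0 or phi == 1 contributes 0
    # (the limit of x*log x as x -> 0 is 0), so we skip it.
    d = 0.0
    if phi > 0:
        d += phi * log2(phi / p)
    if phi < 1:
        d += (1 - phi) * log2((1 - phi) / (1 - p))
    return d

# Identical distributions diverge by 0 bits.
print(binary_kl(0.5, 0.5))  # 0.0
# The divergence grows as phi moves away from p.
print(binary_kl(0.6, 0.5))
print(binary_kl(0.9, 0.5))
```

Note that the divergence is always nonnegative, equals zero exactly when $\phi = p$, and is not symmetric: in general $D(\phi, p) \neq D(p, \phi)$.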