Relative entropy, also known as Kullback–Leibler (KL) divergence, can be used to measure the difference between two probability distributions.
Suppose we have two pairs of distributions, P1 vs. Q1 and P2 vs. Q2. We can compute the relative entropy for each pair, obtaining RE1 and RE2 respectively.
My question is: how can we meaningfully compare RE1 and RE2?
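To make the setup concrete, here is a minimal sketch that computes RE1 and RE2 for two hypothetical pairs of discrete distributions (the specific probability values are made up for illustration):

```python
import math

def kl_divergence(p, q):
    """KL(p || q) in nats, for discrete distributions given as
    sequences of probabilities over the same support.
    Terms with p_i = 0 contribute zero by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example distributions over a 3-element support.
P1, Q1 = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
P2, Q2 = [0.7, 0.2, 0.1], [0.3, 0.3, 0.4]

RE1 = kl_divergence(P1, Q1)
RE2 = kl_divergence(P2, Q2)
print(RE1, RE2)  # both non-negative; here RE2 > RE1
```

Note that each value is measured in nats (use log base 2 for bits), is always non-negative, and is not symmetric in its arguments, which is part of what makes comparing two such values across different pairs of distributions non-trivial.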