Most statistical measures are non-negative. Shannon entropy, $H(X)$, a measure of the disorder or uncertainty in a probability distribution, is also non-negative: probabilities lie in $[0,1]$, so their logarithms are non-positive, and the leading minus sign in the definition flips the sum back to a non-negative value.
The entropy of a copula density, $c(u,v)$, however, is non-positive (and negative for any copula other than the independence copula), even though it carries the same leading minus sign as Shannon entropy. Here the situation is reversed: the double integral itself is non-negative, so the minus sign makes the result non-positive. $$h(c(u,v)) = -\iint_{[0,1]^2} c(u,v) \log c(u,v) \, \mathrm{d}u \, \mathrm{d}v$$
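As a concrete check (a sketch I am adding, not from the post): for a bivariate Gaussian copula with correlation $\rho$, the copula entropy has the closed form $h = \tfrac{1}{2}\log(1-\rho^2)$, which is negative for any $\rho \neq 0$. The snippet below estimates $h = -\mathbb{E}[\log c(U,V)]$ by Monte Carlo, sampling $(X,Y)$ from a standard bivariate normal and using $\log c(\Phi(x),\Phi(y)) = \log f(x,y) - \log \varphi(x) - \log \varphi(y)$; the sample size and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.8, 200_000

# Sample (X, Y) from a standard bivariate normal with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

# log copula density at (u, v) = (Phi(x), Phi(y)):
# log c = log f(x, y) - log phi(x) - log phi(y), which simplifies to
log_c = (-0.5 * np.log(1.0 - rho**2)
         - (x**2 - 2.0 * rho * x * y + y**2) / (2.0 * (1.0 - rho**2))
         + (x**2 + y**2) / 2.0)

h_mc = -np.mean(log_c)                 # Monte Carlo estimate of h(c)
h_exact = 0.5 * np.log(1.0 - rho**2)   # closed form for the Gaussian copula

print(h_mc, h_exact)  # both are about -0.51 for rho = 0.8
```

The estimate agrees with the closed form and is negative, as the question observes.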
Why is it negative? And what is the intuition when we see, for example, a Gaussian copula density with an entropy of $-3.5$? Is it more uncertain than a copula with an entropy of $-2.1$?
From the opposite perspective, why isn't copula entropy positive, or wouldn't it be better if copula entropy were always positive?
Ma, J. and Sun, Z. (2011), 'Mutual information is copula entropy', Tsinghua Science and Technology, 16(1), 51-54.
Not true for differential entropies, which can be negative: for example, the uniform density on $[0, 1/2]$ has differential entropy $\log(1/2) < 0$.
The "entropy of the copula density" is non-positive for the same reason the (differential) entropy of any density supported on $[0,1]^2$ is non-positive: the uniform density is the maximum-entropy density on the unit square, and its entropy is exactly zero.
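To spell out the inequality behind this (a short derivation I am adding, writing $\pi(u,v) \equiv 1$ for the uniform density on $[0,1]^2$):

$$-h(c) = \iint_{[0,1]^2} c(u,v)\,\log c(u,v)\,\mathrm{d}u\,\mathrm{d}v = \iint_{[0,1]^2} c(u,v)\,\log\frac{c(u,v)}{\pi(u,v)}\,\mathrm{d}u\,\mathrm{d}v = D_{\mathrm{KL}}(c \,\|\, \pi) \ge 0,$$

where the middle step uses $\log \pi = 0$ and the last step is Gibbs' inequality. Hence $h(c) \le 0$, with equality iff $c \equiv 1$, i.e. the independence copula. This also matches the Ma and Sun identity $h(c) = -I(X;Y)$: mutual information is non-negative, so copula entropy is non-positive, and a more negative value ($-3.5$ vs. $-2.1$) means *more* dependence, not more uncertainty.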