The concept of Shannon entropy as a basic measure of the uncertainty of a random variable was introduced by Shannon. Let X be an absolutely continuous non-negative random variable with probability density function (pdf) f and cumulative distribution function (cdf) F. In Reliability Theory, X represents the lifetime of a system or a living organism. For such random variables, Shannon entropy is called differential entropy and is defined as follows: $$H(X)=-\int_0^\infty f(x)\log f(x)\,dx.$$
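For concreteness, here is a worked example of my own (not from the quoted text): take X exponential with rate $\lambda$, so $f(x)=\lambda e^{-\lambda x}$. Then
$$H(X)=-\int_0^\infty \lambda e^{-\lambda x}\left(\log\lambda-\lambda x\right)dx=-\log\lambda+\lambda\,E[X]=1-\log\lambda.$$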
Recently, another measure of uncertainty, known as extropy, was proposed by Lad et al. (2015) as a dual measure of Shannon entropy. For a non-negative random variable X, the extropy is defined as
$$J(X)=-\frac{1}{2}\int_0^\infty f^2(x)\,dx.$$
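Continuing the same illustrative exponential example (again mine, not from the source):
$$J(X)=-\frac{1}{2}\int_0^\infty \lambda^2 e^{-2\lambda x}\,dx=-\frac{1}{2}\cdot\frac{\lambda^2}{2\lambda}=-\frac{\lambda}{4}.$$
Note that since $\int_0^\infty f^2(x)\,dx>0$ for any pdf, the extropy of an absolutely continuous random variable is always negative, whereas the differential entropy can take either sign.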
Here it is written that extropy is an alternative way of measuring uncertainty. Why, then, is it also described as the dual complement of entropy?
What is meant by "dual complement" in this context?