Is there a standard term for the log cardinality, or entropy, of the pre-image of an element with respect to a function?


Suppose $h: X \to Y$ where $X$ and $Y$ are finite, i.e., $|X|, |Y| < \infty$. Is there a standard name for the quantity:

$$S_h(y) \equiv \log_2 |\text{Pre-image}_h(y)|?$$

For example, if the pre-image of $y=0$ is $\{0\}$ and the pre-image of $y=1$ is $\{-1, 1\}$, then $S_h(0) = 0$ bits and $S_h(1) = 1$ bit. That is, $S_h(y)$ is simply the size of the pre-image of $y$, measured in log units.
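For concreteness, here is a minimal Python sketch of the quantity $S_h$ for a function given as a callable on an explicit finite domain (the function name, the example map $h(x)=x^2$, and the domain are my own illustrative choices, not standard terminology):

```python
import math
from collections import Counter

def preimage_log_cardinality(h, domain):
    """Return the map S_h: y -> log2 |h^{-1}(y)| for h on a finite domain."""
    counts = Counter(h(x) for x in domain)  # |h^{-1}(y)| for each attained y
    return {y: math.log2(n) for y, n in counts.items()}

# h(x) = x^2 on {-1, 0, 1}: pre-image of 0 is {0}, pre-image of 1 is {-1, 1}
S = preimage_log_cardinality(lambda x: x * x, [-1, 0, 1])
# S[0] == 0.0 bits, S[1] == 1.0 bit, matching the example above
```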

From what I understand "topological entropy" and "pre-image entropy" as defined in the literature are both functions of the map $h$ as a whole. What I am looking for is a term for the entropy of the preimage of a single element $y$ (with respect to $h$). "Local pre-image entropy" seems fitting but I haven't seen it used anywhere.

Note that the quantity $S_h(y)$ is not with respect to any probability measure over $X$, it just refers to the log cardinality of the pre-image (although I guess this is equivalent to assuming a uniform distribution over $X$).

I have not seen a specific name used for this function, but it seems "the Hartley function of the function $h$" could be a good name. Indeed, the use of a logarithmic scale for information is often attributed to Hartley (after his paper "Transmission of Information", available at http://www.dotrose.com/etext/90_Miscellaneous/transmission_of_information_1928b.pdf), and the phrase "Hartley function" seems to be in use (see e.g. https://planetmath.org/hartleyfunction, https://www.planetmath.org/derivationofhartleyfunction, or Klir & Yuan's book Fuzzy Sets and Fuzzy Logic, p.247). Thus the Hartley operator is defined by:

$$\mathcal{H}: [X\to Y] \to [Y\to \mathbb{R}],\,\, f\mapsto [y\mapsto \log_2(\#(f^{-1}(y)))].$$


Without the logarithm, the function is a special case of the transgression (or Ruelle-Perron-Frobenius, or transfer, or integration-along-the-fibers) operator:

$$\mathcal{T}: [X\to Y]\times [X\to \mathbb{R}]\to [[X\to \mathbb{R}]\to [Y\to\mathbb{R}]],$$

$$(f,\rho)\mapsto \left[\phi\mapsto \left[y\mapsto \sum_{x\in f^{-1}(y)}\rho(x)\phi(x)\right]\right].$$

Then $\mathcal{T}(f,1)(1)(y)=\#(f^{-1}(y))$. (See e.g. https://ncatlab.org/nlab/show/transgression, the question "Why is the Ruelle Transfer Operator a bounded operator?", or Exr. 1 on p. 86 of Kostrikin & Manin's Linear Algebra and Geometry.)
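For a finite domain, the transgression operator above can be sketched directly, and specializing $\rho$ and $\phi$ to the constant function $1$ recovers the pre-image count (the names `transgression` and `T_phi` are illustrative, not a standard API):

```python
from collections import defaultdict

def transgression(f, rho, domain):
    """T(f, rho): phi -> (y -> sum over x in f^{-1}(y) of rho(x) * phi(x))."""
    def T_phi(phi):
        out = defaultdict(float)
        for x in domain:          # each x contributes to the fiber over f(x)
            out[f(x)] += rho(x) * phi(x)
        return dict(out)
    return T_phi

# With rho = phi = 1, T(f, 1)(1)(y) = #(f^{-1}(y)):
count = transgression(lambda x: x * x, lambda x: 1, [-1, 0, 1])(lambda x: 1)
# count[0] == 1.0 and count[1] == 2.0, i.e. the fiber cardinalities
```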


Added: See also Aczél & Daróczy's On measures of information and their characterizations, pp.120, 148, who also use the term "Hartley entropy" or "Rényi entropy of order $0$". Rényi himself in Foundations of Probability, p.23 uses the term "qualitative entropy". Thus it seems one can use the phrase "Hartley preimage entropy" etc. for the function at hand.