Can information-theory quantify deviations from bijectivity?


There are basic qualitative ways to classify deviations from bijectivity of a function $f: X \to Y$, e.g. non-injective, non-surjective, non-existence of an inverse; more generally, non-monomorphic and non-epimorphic.

Are there two "natural" quantitative measures of deviation from injectivity and surjectivity?

Is there one natural quantitative measure of "total deviation" from bijectivity, combining both the injectivity and surjectivity violations? And from being an isomorphism?

Brainstorm: the relative entropy of $\{f^{-1}(y)\}$, i.e. of the preimages of $f$, seems relevant to measuring relative injectivity. The relative measure or cardinality $|\mathrm{Im}(f)|/|\mathrm{Cod}(f)|$ seems one way to measure surjectivity. Both invoke additional concepts, e.g. probability or measure. I am sure mathematicians have clearer and better ways that I can't think of. Any ideas or references will be most welcome.
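As a concrete illustration of the second idea, here is a minimal sketch for finite functions represented as Python dicts. The name `surjectivity_ratio` is my own, not standard terminology:

```python
# Sketch of the cardinality ratio |Im(f)| / |Cod(f)| for a finite function,
# where f is given as a dict from domain elements to codomain elements.

def surjectivity_ratio(f: dict, codomain: set) -> float:
    """Fraction of the codomain actually hit by f; equals 1.0 iff f is surjective."""
    image = set(f.values())
    return len(image) / len(codomain)

# Example: f maps {1, 2, 3} into {a, b, c, d} but hits only {a, b}.
f = {1: "a", 2: "a", 3: "b"}
print(surjectivity_ratio(f, {"a", "b", "c", "d"}))  # → 0.5
```

The ratio is 1 exactly when $f$ is surjective, and smaller values quantify how much of the codomain is missed.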

The above mostly concerns functions between sets. How does one ask and answer the corresponding questions for structure-preserving functions, e.g. monotone, equivariant, homomorphic, or continuous maps? (Apologies if this latter part is asking too much in one question!)

Thanks!

1 Answer

This is not a metric, but the following allows you to partially order functions by injectivity.

There is a partial order on the inverse-image equivalence relations induced by the set of all functions $\{f \mid f: A \to B\}$: each $f$ induces an equivalence relation (a quotient) on the domain $A$, defined by $x \equiv_f y$ iff $f(x) = f(y)$, for $x, y \in A$. The partial order records how coarse or fine these equivalence relations are relative to one another, which in turn encodes exactly the relative injectivity. A limitation is that this is not a total order: two functions may be incomparable, and the order alone would never tell you which is "more injective".
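This refinement order is easy to check for finite functions: $f$ is at least as injective as $g$ iff $f(x) = f(y)$ implies $g(x) = g(y)$. A minimal sketch, with function names of my own choosing:

```python
# Sketch of the refinement partial order on induced equivalence relations:
# f's kernel refines g's kernel iff f(x) == f(y) implies g(x) == g(y)
# for all x, y in the (shared) domain, i.e. f is at least as injective as g.

from itertools import combinations

def refines(f: dict, g: dict) -> bool:
    """True iff the equivalence relation induced by f is finer than
    (or equal to) the one induced by g."""
    return all(g[x] == g[y]
               for x, y in combinations(f, 2)
               if f[x] == f[y])

f = {1: "a", 2: "b", 3: "c"}   # injective: the finest possible kernel
g = {1: "u", 2: "u", 3: "v"}   # merges 1 and 2
h = {1: "p", 2: "q", 3: "p"}   # merges 1 and 3

print(refines(f, g))  # → True: an injective f refines every kernel
print(refines(g, h))  # → False
print(refines(h, g))  # → False: g and h are incomparable
```

The last two lines exhibit the limitation mentioned above: `g` and `h` are incomparable in the order, even though each merges exactly one pair of points.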

In terms of a numeric measure, for functions between finite sets you can also compute the entropy of the family of cardinalities $(|f^{-1}(y)|)_{y \in B}$, treated as a probability distribution (a normalized frequency histogram). I am not sure at present how this generalizes.
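A minimal sketch of that measure, assuming the distribution $p_y = |f^{-1}(y)|/|A|$ over the image of $f$ (the function name is my own):

```python
# Sketch of the answer's numeric measure: Shannon entropy (in bits) of the
# normalized preimage-size distribution (|f^{-1}(y)| / |A|) over y in Im(f).
# It is maximal (log2 |A|) when f is injective and 0 when f is constant.

from collections import Counter
from math import log2

def preimage_entropy(f: dict) -> float:
    """Entropy in bits of the preimage-size distribution of f."""
    n = len(f)
    counts = Counter(f.values())  # |f^{-1}(y)| for each y in Im(f)
    return sum((c / n) * log2(n / c) for c in counts.values())

injective = {1: "a", 2: "b", 3: "c", 4: "d"}
constant  = {1: "a", 2: "a", 3: "a", 4: "a"}
print(preimage_entropy(injective))  # → 2.0 (= log2 4, maximal)
print(preimage_entropy(constant))   # → 0.0
```

Note this quantifies injectivity only: it ignores codomain points outside the image, so it says nothing about surjectivity and would need to be combined with something like the cardinality ratio for a "total" measure.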