Looking for an approach to mathematical notation wherein the universe is divided into disjoint worlds.


Is there a rigorous approach to mathematical notation wherein the "universe" is divided into disjoint "worlds," and the meaning of notation is world-dependent? This would solve a few pesky problems. Ultimately, I'm not looking for a hand-wavy solution, but rather something that is nearly computer-readable.

Here are three examples of pesky issues I'd like solved in a rigorous manner. Note that many of the following issues only arise if we use a set-theoretic foundation, because in such a foundation everything is a set. However, I believe that similar issues would arise in any foundation.

  1. If $f$ and $g$ are functions mapping $X \rightarrow Y$ and $*$ is a binary operation on $Y$, I would want $f * g$ to equal $x \mapsto f(x)*g(x)$. So for example, the expression $f \cup g$ should equal $x \mapsto f(x) \cup g(x)$. Of course, if $f$ and $g$ are viewed as sets of ordered pairs, then $f \cup g$ already has a meaning, so we get a conflict. This conflict goes away if we view functions as existing in a different "world" from sets, so that the same notation can mean different things.
  2. If $X$ and $Y$ are random variables, I would want $(X,Y)$ to denote the random variable $\omega \mapsto (X(\omega),Y(\omega))$. However, $(X,Y)$ already has a meaning (it's an ordered pair), and thus we get a conflict. One possible solution would be to view $X$ and $Y$ as belonging to a world where $(*,*)$ is defined differently than usual.
  3. It is common to write $f(X)$ as shorthand for $\{f(x)\,|\,x \in X\}$. However, if natural numbers are constructed in the manner of von Neumann, then for example $2 = \{0,1\}$, thus $f(2)=\{f(x) \,|\,x \in 2\}=\{f(x) \,|\,x \in \{0,1\}\}=\{f(0),f(1)\}$, which is probably not what the writer meant by $f(2)$. To avoid this, natural numbers should be viewed as living in a world that is disjoint from the world of sets (that is, a natural number should not be identified with its von Neumann encoding).
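To make example 1 concrete, here is a minimal sketch (not a full answer to the question) of how a programming language with type-based operator dispatch already implements a weak form of "world-dependent" notation: the meaning of a symbol like `|` depends on the type (world) of its operands. The `Fn` wrapper class below is my own illustrative invention, not standard notation.

```python
class Fn:
    """Wraps a function X -> Y in its own 'world', distinct from sets."""

    def __init__(self, f):
        self.f = f

    def __call__(self, x):
        return self.f(x)

    def __or__(self, other):
        # In the world of functions, | is pointwise:
        # (f | g)(x) = f(x) | g(x)
        return Fn(lambda x: self.f(x) | other.f(x))


f = Fn(lambda x: {x})
g = Fn(lambda x: {x + 1})

h = f | g                # pointwise union, not union of graphs
print(h(0))              # {0, 1}

# In the world of sets, | is still ordinary union:
print({1, 2} | {2, 3})   # {1, 2, 3}
```

Because `Fn` objects and plain sets are different types, the same symbol `|` carries a different meaning in each world, and no conflict arises.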

So to reiterate, I'm looking for an approach to mathematical notation wherein the universe is divided into disjoint worlds and notation is world-dependent.
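Example 2 admits the same kind of sketch: if random variables live in their own world, a pair-forming operation in that world can build the joint random variable $\omega \mapsto (X(\omega), Y(\omega))$ rather than an ordered pair of function graphs. The `RV` class and `pair` function below are hypothetical names for illustration only.

```python
class RV:
    """A random variable: a function from outcomes to values."""

    def __init__(self, f):
        self.f = f

    def __call__(self, omega):
        return self.f(omega)


def pair(X, Y):
    # In the world of random variables, pairing denotes the joint
    # variable omega -> (X(omega), Y(omega)), not an ordered pair
    # of the underlying sets.
    return RV(lambda omega: (X(omega), Y(omega)))


X = RV(lambda omega: omega % 2)
Y = RV(lambda omega: omega // 2)

XY = pair(X, Y)
print(XY(5))   # (1, 2)
```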

EDIT: Caveman in his answer suggests that the simply-typed lambda calculus solves the problem. If anyone knows of a gentle introduction to this field, please leave a comment.
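For what it's worth, a proof assistant built on type theory makes the "disjoint worlds" behavior of example 3 visible directly: each term has exactly one type, so `2 : Nat` is not a set, and applying a function on naturals to a set is a type error rather than an accidental image. A rough sketch in Lean 4 (using `List Nat` in place of a set type for simplicity):

```lean
-- In a typed foundation, 2 : Nat is not identified with {0, 1},
-- so f 2 is unambiguous.

def f (n : Nat) : Nat := n + 1

#check f 2            -- f 2 : Nat
-- #check f [0, 1]    -- type error: a list is not a Nat

-- Taking the image of a collection requires its own, explicitly
-- typed operation:
def image (f : Nat → Nat) (s : List Nat) : List Nat := s.map f

#eval image f [0, 1]  -- [1, 2]
```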