(The set of natural numbers $\mathbb{N}$ starts at $0$ for me.)
Let $X$ denote a set, and define $X_\bot = X \uplus \{\bot\}.$ Let $\mathbf{else}$ denote the binary operation on $X_\bot$ defined as follows.
- If $x \in X$, then $x \mathop{\mathbf{else}} y = x$.
- If $x = \bot$, then $x \mathop{\mathbf{else}} y = y$.
It follows that $\mathbf{else}$ is an associative (though usually non-commutative) operation with identity element $\bot$. Hence, $\mathbf{else}$ makes $X_\bot$ into a monoid, which could reasonably be called the "else-monoid over $X$."
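As a minimal sketch, the operation can be modeled in Python by letting `None` play the role of $\bot$ (the names here are my own, not standard):

```python
# The "else" operation on X_bot, with None standing in for the adjoined point ⊥.
def else_(x, y):
    """Return x if x is defined (not None), otherwise fall back to y."""
    return x if x is not None else y

# else_ is associative with identity None:
#   else_(else_(x, y), z) == else_(x, else_(y, z))
#   else_(None, x) == x == else_(x, None)
```

This is essentially the `First` monoid on `Maybe` values familiar from Haskell, restricted to a single carrier set.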
Question. Where can I learn more about the operation $\mathbf{else}$ and/or the "else-monoid"?
Here's some stuff we can do with it. If $f,g : X \rightarrow Y$ are partial functions that agree on the intersection of their domains, then there is a corresponding partial function $f \cup g : X \rightarrow Y$ given by asserting that $(f \cup g)(x) = f(x)$ whenever $f(x)$ is defined, and that $(f \cup g)(x) = g(x)$ whenever $g(x)$ is defined. However, $f$ and $g$ may fail to agree on the intersection of their domains! Not to worry: $\mathbf{else}$ comes to the rescue. Recall that a partial function $X \rightarrow Y$ is the "same as" a total function $X \rightarrow Y_\bot$, where we imagine that the function returns $\bot$ at points where it fails to be defined. Hence it makes sense to define $(f \mathbin{\mathbf{else}} g)(x) = f(x) \mathbin{\mathbf{else}} g(x).$ Then $f \mathbin{\mathbf{else}} g$ agrees with $f \cup g$ whenever the latter is well-defined; the advantage of the former is that it is always defined, even when $f$ and $g$ disagree.
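To make the pointwise construction concrete, here is a sketch in which partial functions on a finite domain are modeled as Python dicts, with a missing key playing the role of $\bot$ (the representation is my own choice, not canonical):

```python
# Partial functions X -> Y modeled as dicts; a missing key means "undefined".
def pointwise_else(f, g):
    """(f else g)(x) = f(x) if f is defined at x, otherwise g(x)."""
    return {**g, **f}  # later entries win, so f's values override g's

f = {0: 'a', 1: 'b'}
g = {1: 'B', 2: 'c'}
h = pointwise_else(f, g)
# h == {0: 'a', 1: 'b', 2: 'c'}: it agrees with f ∪ g away from the overlap,
# and resolves the disagreement at 1 in favor of f.
```

Note that `pointwise_else` is total on pairs of dicts even though `f` and `g` disagree at `1`, mirroring the point above.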
Here's an extremely simple application. If $a : \mathbb{N} \rightarrow \mathbb{R}$ is a sequence of real numbers, then we may wish to define the shift of $a$ as the sequence $\mathrm{Sh}(a) : \mathbb{N} \rightarrow \mathbb{R}$ defined as follows.
- $\mathrm{Sh}(a)_0 = 0$
- $\mathrm{Sh}(a)_{i+1} = a_i$
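The two-clause definition above translates directly into a case split on the index (a small illustrative sketch; the function names are mine):

```python
# Naive definition of the shift, by case split on the index.
def shift(a):
    """Sh(a)_0 = 0 and Sh(a)_{i+1} = a_i, for a sequence a : N -> R."""
    def Sh(i):
        return 0 if i == 0 else a(i - 1)
    return Sh

sq = lambda i: i * i   # the sequence a_i = i^2
# shift(sq)(0) == 0, shift(sq)(1) == 0, shift(sq)(3) == 4
```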
This is a linear transformation; $\mathrm{Sh}$ shows up as an important counterexample in functional analysis. In fact, thinking in terms of formal power series, $\mathrm{Sh}$ is basically multiplication by $x$. Explicitly:
$$x \cdot \sum_{i:\mathbb{N}} a_i x^i = \sum_{i:\mathbb{N}} \mathrm{Sh}(a)_i x^i$$
Okay, here's a cool observation: we can give a simpler description of $\mathrm{Sh}$ using the $\mathbf{else}$ operation. Namely:
$$\mathrm{Sh}(a)_i = a_{i - 1} \mathop{\mathbf{else}} 0$$
Essentially, this works because $\mathbf{else}$ gives an action of partial functions on total functions. That is, given a partial function $p : X \rightarrow Y$ and a total function $f : X \rightarrow Y$, we get a new total function $p \mathbin{\mathbf{else}} f$, computed pointwise. This gives an action of the monoid $(\mathbf{PartialFunctions}(X,Y),\mathbf{else},\bot)$ on the set $\mathbf{TotalFunctions}(X,Y)$. The above definition of $\mathrm{Sh}(a)$ is obtained by having the partial function $i \in \mathbb{N} \mapsto a_{i-1}$ act on the total function $i \in \mathbb{N} \mapsto 0$, thereby yielding a new total function. I wonder if there are messy arguments involving hybrid functions that can be simplified using $\mathbf{else}$.
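The monoid-action description can be sketched as follows, again with `None` standing in for $\bot$: the partial function $i \mapsto a_{i-1}$ (undefined at $0$) acts on the constant total function $0$.

```python
def else_(x, y):
    """x else y, with None playing the role of ⊥."""
    return x if x is not None else y

def shift(a):
    """Sh(a)_i = a_{i-1} else 0: a partial function acting on the constant 0."""
    partial = lambda i: a(i - 1) if i >= 1 else None  # undefined (⊥) at i = 0
    return lambda i: else_(partial(i), 0)             # fall back to 0

sq = lambda i: i * i   # the sequence a_i = i^2
# shift(sq)(0) == 0, shift(sq)(3) == 4
```

No explicit case split on the index appears in the body of `shift`; the fallback is handled entirely by `else_`, which is the simplification being advertised.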