$\newcommand{\cM}{\mathcal{M}} \newcommand{\R}{\mathbb{R}}$I'm writing about a machine learning model selection problem where we have a finite set $\cM$ of models, each having its own abstract parameter space $\Theta_M$. The learning problem induces a reward function $R_M : \Theta_M \to \R$ defining the "goodness" of a particular model and choice of parameters for that model.
Although each $R_M$ has a different domain, semantically they all compute the "goodness" for the same learning task. In some parts of the exposition, it would help to have a unifying notation for the "function" $R$ that computes:
- inputs: $M \in \cM,\ \theta \in \Theta_M$
- outputs: $R_M(\theta)$.
But it doesn't make sense to define $R : \cM \times \bigcup_M \Theta_M \to \R$ because $R(M_i, \theta)$ is undefined when $\theta \in \Theta_j$ with $i \neq j$. Another place where difficulty arises is talking about the "universal optimization algorithm" $A$ that, given $M \in \cM$, outputs some $\theta \in \Theta_M$.
Is there a preferred way to handle $R$ and $A$ in notation? Or, an alternate way to frame the problem that avoids this issue entirely?
The domain of $R$ is an example of a dependent sum, also called an indexed disjoint union.
When $A$ is a set and $B_a$ is a set for each $a \in A$, the dependent sum is defined as follows: $$\sum_{a \in A} B_a = \left \{ (a, b) \in A \times \bigcup_{a \in A} B_a : b \in B_a \right \}$$
In your case, you can write: $$R \colon \sum_{M \in \mathcal M} \Theta_M \to \mathbb R$$ Then $R(M, \theta)$ is defined precisely when $M \in \mathcal M$ and $\theta \in \Theta_M$.
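If it helps to see this operationally, here is a minimal value-level sketch of the same idea: the dependent sum is encoded as tagged pairs $(M, \theta)$, and the unified $R$ checks $\theta \in \Theta_M$ before dispatching to $R_M$. The model names, parameter spaces, and reward formulas below are hypothetical placeholders, not part of your problem.

```python
# Sketch: the dependent sum as tagged pairs (M, theta), valid only when theta ∈ Theta_M.
# "linear" and "tree", their parameter spaces, and their rewards are made up for illustration.

# Membership test standing in for each abstract parameter space Theta_M.
THETA = {
    "linear": lambda th: isinstance(th, tuple) and len(th) == 2,  # e.g. (slope, intercept)
    "tree":   lambda th: isinstance(th, int) and th >= 0,         # e.g. maximum depth
}

# Per-model reward R_M.
R_M = {
    "linear": lambda th: -(th[0] ** 2 + th[1] ** 2),
    "tree":   lambda th: -float(th),
}

def R(M, theta):
    """Unified R on the dependent sum: defined exactly when theta ∈ Theta_M."""
    if not THETA[M](theta):
        raise ValueError(f"theta is not in Theta_{M}")
    return R_M[M](theta)
```

Calling `R("tree", (1.0, 2.0))` raises, mirroring the fact that $(M_i, \theta)$ with $\theta \notin \Theta_{M_i}$ is simply not an element of the dependent sum.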
Be aware that this notation, while common in type theory and category theory, may be unfamiliar to your readers, so I would suggest defining it clearly before you use it.