This is a question about convention. I am confused about whether a symbol denotes a specific, fixed number or an arbitrary number. Here is a recent example from the definition of margin for perceptrons (in machine learning).
Let $\{(x_1, y_1), (x_2, y_2), \dots\}$ be the training data. Assume that $\|x_t\|=1$ for all $t$. The data set is linearly separable with margin $\delta$ if there exist unit vectors $v_1, v_2$ such that for every example $(x_t, y_t)$,
$$v_1^T x_t - v_2^T x_t \ge 2 \delta.$$
My problem here is that this definition does not say whether $\delta$ is a specific number or ranges over all possible numbers.
If I take it that the inequality should hold for all possible $\delta$, then we can let $\delta_m = \max \delta$, so that
$$v_1^Tx_t - v_2^T x_t \ge 2 \delta_m$$
It follows from the above that $v_1=-v_2$.
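To make the non-uniqueness concrete, here is a small numeric sketch (with made-up data and hand-picked unit vectors, not from the original source). For a fixed pair $v_1, v_2$, the largest $\delta$ satisfying the inequality for every example is $\delta_{\max} = \tfrac{1}{2}\min_t (v_1 - v_2)^T x_t$, and any smaller $\delta$ works as well:

```python
import math

# Hypothetical 2-D data points, normalized so that ||x_t|| = 1
# as assumed in the definition above.
xs = [(1.0, 0.0),
      (math.sqrt(0.5), math.sqrt(0.5)),
      (math.sqrt(0.5), -math.sqrt(0.5))]

# A fixed pair of unit vectors, chosen by hand for illustration.
v1 = (1.0, 0.0)
v2 = (-1.0, 0.0)

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

# The largest delta with v1.x_t - v2.x_t >= 2*delta for every x_t
# is half the smallest value of (v1 - v2).x_t over the data set.
delta_max = min(dot(v1, x) - dot(v2, x) for x in xs) / 2

# Any delta <= delta_max also satisfies the definition, so the margin
# in the definition is not unique.
assert all(dot(v1, x) - dot(v2, x) >= 2 * (delta_max / 2) for x in xs)
print(delta_max)
```

With these particular vectors, $(v_1 - v_2)^T x_t = 2\,x_{t,1}$, so $\delta_{\max}$ is simply the smallest first coordinate among the data points.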
But I am not sure what it means.
It may be that $\delta$ need not be unique for the definition to serve its intended purpose. Indeed, that seems to be the case here: the definition doesn't mind that the same data set is linearly separable with several different margins.
It's like the case of, say, the definition of an upper bound of a set. A real number $\beta$ is said to be an upper bound for a set $S$ of real numbers provided $x \le \beta$ for every $x \in S$. Clearly, it is likewise not essential here that $\beta$ be uniquely determined.
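The upper-bound analogy can be checked the same way; this is a tiny sketch with a hypothetical finite set, showing that many different $\beta$ all satisfy the definition:

```python
# A hypothetical finite set of reals (any finite set works).
S = [1.5, 3.0, 2.2]

def is_upper_bound(beta, S):
    # beta is an upper bound for S iff x <= beta for every x in S.
    return all(x <= beta for x in S)

# The least upper bound of a finite set is max(S),
# but every larger beta is an upper bound too:
print(is_upper_bound(3.0, S))    # True
print(is_upper_bound(100.0, S))  # True: upper bounds are not unique
print(is_upper_bound(2.9, S))    # False
```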