Let $s = (s_1,...,s_n) \in (0,1)^n$. Suppose that $x$ is a real number and that $x \in (\min_j s_j, \max_j s_j)$. It's not difficult to show, by a direct construction, that
There exists a probability vector $p = (p_1,...,p_n) \in (0,1)^n$ such that $s \cdot p = x$,
where $s \cdot p$ is the inner product, and $p$ is a probability vector if $\sum_j p_j = 1$.
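For concreteness, here is one way the direct construction can go, sketched in Python. The scheme (spread a small mass $\delta$ uniformly, then split the remaining $1-\delta$ between an argmin and an argmax of $s$ so the inner product lands exactly on $x$) and the helper name `construct_p` are my own choices, not taken from the question:

```python
def construct_p(s, x, delta=1e-3):
    # Direct construction: put delta/n mass on every coordinate, then
    # split the remaining 1 - delta between an argmin and an argmax of s
    # so that the inner product s . p equals x exactly.
    n = len(s)
    m = min(range(n), key=lambda j: s[j])  # index of a minimal entry
    M = max(range(n), key=lambda j: s[j])  # index of a maximal entry
    assert s[m] < x < s[M]
    mean = sum(s) / n
    # Solve (1 - delta) * (lam * s[m] + (1 - lam) * s[M]) = x - delta * mean.
    lam = ((1 - delta) * s[M] - (x - delta * mean)) / ((1 - delta) * (s[M] - s[m]))
    assert 0 < lam < 1  # holds whenever delta is small enough
    p = [delta / n] * n
    p[m] += (1 - delta) * lam
    p[M] += (1 - delta) * (1 - lam)
    return p
```

By construction $\sum_j p_j = \delta + (1-\delta) = 1$, every entry lies in $(0,1)$ for small $\delta$, and $s \cdot p = \delta \bar{s} + (x - \delta \bar{s}) = x$.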
I'm wondering if this result can be demonstrated using a separating hyperplane argument. Here's the idea.
Suppose for contradiction that no such $p \in (0,1)^n$ exists. Then, with $X = \{p \in (0,1)^n: p \ \text{a probability vector} \}$ and $Y = \{p: s \cdot p = x\}$, we have $X \cap Y = \emptyset$. Both sets $X$ and $Y$ are convex. So, by the separation theorem, there are a nonzero vector $q \in \mathbb{R}^n$ and a scalar $b \in \mathbb{R}$ such that $q \cdot p \leq b$ for all $p \in X$ and $q \cdot p \geq b$ for all $p \in Y$. I don't see a way to derive a contradiction from this, however, so any help here would be appreciated.
I would also like to generalize this argument, if it can be made to work, to the countably infinite case, i.e. replacing $n$ with $\mathbb{N}$. Any suggestions about how to achieve that would also be appreciated.
Lemma: $q$ and $s$ are parallel. Geometrically this is somewhat obvious(?): $Y$ is an $(n-1)$-dimensional hyperplane in $n$-space with normal $s$, so any hyperplane having $Y$ wholly on one side of it must be parallel to $Y$, i.e. have a normal parallel to the same $s$. The proof is a little tedious and is at the end.
Assuming the Lemma, WLOG we can take $q=s$, which forces $b \le x$ in order to satisfy $\forall p \in Y: q \cdot p \ge b$ (since $s \cdot p = x$ for every $p \in Y$).
Now construct $p$ to be the probability vector with $1-\delta$ at an index attaining $\max_j s_j$ and ${\delta \over n-1}$ at each of the other $n-1$ indices, for some small $\delta > 0.$ This $p \in X$ (i.e. $p_j \in (0,1)$ and $\sum_j p_j = 1$), and $s \cdot p > (1-\delta) \max_j s_j$, where the inequality is strict because all the other terms of the dot product are strictly positive, since ${\delta \over n-1} > 0$ and $s_j > 0$.
Meanwhile, $x$ is a given constant, and $x < \max_j s_j$ implies we can choose $\delta > 0$ s.t. $(1 - \delta) \max_j s_j > x$. Thus, $s \cdot p > (1 - \delta) \max_j s_j > x \ge b$. This contradicts the separating hyperplane property $\forall p \in X: s \cdot p \le b$.
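The witness above can be checked numerically. This sketch (the function name `contradiction_witness` and the particular choice of $\delta$ halfway to the bound $1 - x/\max_j s_j$ are mine) builds the vector from the argument and confirms $s \cdot p > x$:

```python
def contradiction_witness(s, x):
    # Witness from the argument above: mass 1 - delta on an argmax of s,
    # delta / (n-1) on each other coordinate, with delta chosen so that
    # (1 - delta) * max(s) > x.
    n = len(s)
    M = max(range(n), key=lambda j: s[j])  # index of a maximal entry
    delta = 0.5 * (1 - x / s[M])           # any delta in (0, 1 - x/max(s)) works
    p = [delta / (n - 1)] * n
    p[M] = 1 - delta
    return p, delta
```

With this $\delta$, $(1-\delta)\max_j s_j = \tfrac{1}{2}(\max_j s_j + x) > x$, so $s \cdot p > x \ge b$, the contradiction claimed.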
Comments: (1) Sorry this still required a construction. :) (2) I think you needed to define $X$ not simply as $(0,1)^n,$ but rather as $X=\{p: p \in (0,1)^n \text{ and } p \text{ is a probability vector}\}$.
Proof of Lemma: Decompose the separating vector $q$ into components parallel and orthogonal to $s$, i.e. $q = as + t$ for some real $a$ and some $t \perp s$ (i.e. $s \cdot t = 0$). By definition of $Y$, for any real $c$ the point $y = {x \over |s|^2} s + c t$ lies in $Y$, since $s \cdot y = x + 0 = x$. Meanwhile, $q \cdot y = ax+c|t|^2$. Since $a, x, b$ are all constants while $c$ ranges over all of $\mathbb{R}$, the term $c|t|^2$ is unbounded below unless $|t|^2 = 0$; so the only way we can have $q \cdot y = ax+c|t|^2 \ge b$ for every $c$ is if $t=0$, i.e. $q = as$, i.e. $q \parallel s$. QED