I am trying to prove the following result:
Let $X$ be a subset of $\mathbb{R}^n$ and $H=\{x \in \mathbb{R}^n:\langle x,u\rangle=\alpha\}$ be a hyperplane with normal vector $u$ such that $X \subset H^-$, where $H^-=\{x \in \mathbb{R}^n:\langle x,u\rangle \leq \alpha\}$. Then \begin{align*} \text{conv}(X) \cap H = \text{conv}(X \cap H), \end{align*} where $\text{conv}$ is the convex hull.
One direction is straightforward: we have $X \cap H \subset \text{conv}(X) \cap H$, thus $$\text{conv}(X \cap H) \subset \text{conv}(\text{conv}(X) \cap H) = \text{conv}(X) \cap H $$
since $\text{conv}(X) \cap H$ is convex.
For the other direction, I'm not sure how to use the condition $X \subset H^-$ to obtain the reverse inclusion. For example, we have that
\begin{align*} \text{conv}(X) \cap H = \bigcap_{\substack{X \subset C, C \text{ convex}}} C \cap H. \end{align*}
Since $X \subset H^-$, $$\text{conv}(X) \cap H = \bigcap_{\substack{X \subset C \subset H^-, C \text{ convex}}} C \cap H.$$
Now $C \cap H$ is convex, and $X \cap H \subset C \cap H$, but I can't reach the desired conclusion.
In fact, arguing directly with convex combinations works. If $x\in \text{conv}(X)\cap H$, we can write $x=\sum_{i=1}^k w_ix_i$ for some $x_i\in X$ and $w_i\in[0,1]$ with $\sum_{i=1}^k w_i=1$ (the number of terms $k$ need not equal the dimension $n$). Since $x\in H$ and the inner product is linear, we have $$\alpha=\langle x,u\rangle=\sum_{i=1}^k w_i\langle x_i,u\rangle\leqslant \alpha\sum_{i=1}^k w_i=\alpha.$$ If any of the inequalities $\langle x_i,u\rangle\leqslant \alpha$ with $w_i>0$ were strict, the inequality on the sum would be strict, a contradiction. So $\langle x_i,u\rangle=\alpha$, i.e. $x_i\in H$, for every $i$ with $w_i>0$. Discarding the terms with $w_i=0$ then exhibits $x$ as a convex combination of points of $X\cap H$, hence $x\in\text{conv}(X\cap H)$, which is the reverse inclusion.
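As a sanity check of the identity (not a proof), here is a small numerical sketch in $\mathbb{R}^2$ with a concrete choice of $X$, $u$, $\alpha$ of my own: $X=\{(0,0),(2,0),(1,1)\}$, $u=(0,1)$, $\alpha=1$, so $X\subset H^-$ and $X\cap H=\{(1,1)\}$. Sampling convex combinations of $X$ on a weight grid, every sampled point of $\text{conv}(X)$ that lands on $H$ should be $(1,1)$, i.e. already a convex combination of $X\cap H$:

```python
import itertools

# Example in R^2: X = {(0,0), (2,0), (1,1)}, u = (0,1), alpha = 1.
# Every point of X satisfies <x, u> <= 1, so X lies in H^-.
# X ∩ H = {(1,1)}, so conv(X ∩ H) should be exactly {(1,1)}.
X = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.0)]
u = (0.0, 1.0)
alpha = 1.0

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Sample convex combinations w = (w1, w2, w3) on a grid of weights.
steps = 50
on_H = []
for i, j in itertools.product(range(steps + 1), repeat=2):
    if i + j > steps:
        continue
    w = (i / steps, j / steps, (steps - i - j) / steps)
    # x = w1*x1 + w2*x2 + w3*x3 is a point of conv(X)
    x = tuple(sum(wk * pk[d] for wk, pk in zip(w, X)) for d in range(2))
    if abs(dot(x, u) - alpha) < 1e-12:  # x lies on the hyperplane H
        on_H.append(x)

# Every sampled point of conv(X) on H is (1,1) = the single point of X ∩ H.
assert all(abs(x[0] - 1.0) < 1e-9 and abs(x[1] - 1.0) < 1e-9 for x in on_H)
print(on_H)
```

This only probes a grid of weights, of course; the argument above is what actually establishes $\text{conv}(X)\cap H \subset \text{conv}(X\cap H)$.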