Supporting hyperplane for Bayes boundary of a convex set


I was reading the book 'Optimal Statistical Decisions' by DeGroot. I came across the following claim, without proof:

$G$ is a convex set in $\mathbb{R}^k$ that is the convex hull of a finite number of points. A point $y \in G$ belongs to the Bayes boundary of $G$ if there is no point $x \in G$ such that $x_i < y_i$ for every $i = 1, \ldots, k$. If $y$ is on the Bayes boundary of $G$, then there exists a supporting hyperplane $\langle a, x \rangle = c$ to $G$ at $y$ such that $a \geq 0$. (Here $a$ is oriented so that $\langle a, z \rangle \geq c$ for every $z \in G$.)
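To make the definition concrete, here is a small numerical sketch in Python (my own toy example, not from DeGroot): $G$ is the convex hull of three points in $\mathbb{R}^2$. The vertices $(0,1)$ and $(1,0)$ are on the Bayes boundary, while $(1,1)$ is strictly dominated componentwise, e.g. by the midpoint $(0.5, 0.5)$ of the other two vertices. The dominance check below just samples random convex combinations, so it is a heuristic illustration, not a proof.

```python
import numpy as np

# Toy example: G is the convex hull of three points in R^2.
V = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])

def strictly_dominated(y, V, n=200):
    """Heuristic check: is some point of G strictly below y in every coordinate?"""
    rng = np.random.default_rng(0)
    w = rng.dirichlet(np.ones(len(V)), size=n)  # random convex weights
    X = w @ V                                   # points of G = conv(V)
    return bool(np.any(np.all(X < y - 1e-9, axis=1)))

for y in V:
    print(y, "on Bayes boundary:", not strictly_dominated(y, V))
```

Only $(0,1)$ and $(1,0)$ survive the dominance test, matching the definition above.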

I tried proving it, and while it seems intuitively obvious, I am not able to prove it formally. Could someone help me with it?




Suppose $y$ is on the Bayes boundary of $G$. The set $L = \left\lbrace x \colon x_{i} < y_{i} \text{ for all } i \right\rbrace$ is convex and, by the definition of the Bayes boundary, disjoint from $G$. The Separating Hyperplane Theorem gives a nonzero vector $p$ such that $p^{T}x \leq p^{T}z$ whenever $x \in L$ and $z \in G$. The point $y$ lies in the closure of $L$ and belongs to $G$, so $p^{T}y \leq p^{T}z$ for every $z \in G$; that is, $z \mapsto p^{T}z$ is minimized over $G$ at $y$, and the hyperplane $\lbrace x \in \mathbb{R}^{k} \colon p^{T}x = p^{T}y\rbrace$ supports $G$ at $y$.

It remains to show that $p \geq 0$. If some coordinate of $p$ were negative, define $y^{\prime}$ by $$ y_{i}^{\prime} = \begin{cases} y_{i} - 1 \quad \text{if $p_{i} < 0$}, \\ y_{i} - \varepsilon \quad \text{if $p_{i} \geq 0$}, \end{cases} $$ for all $i$. Then $$p^{T}y^{\prime} - p^{T}y = -\sum_{p_{i} < 0} p_{i} - \varepsilon \sum_{p_{i} \geq 0} p_{i},$$ and the first sum is strictly positive, so $p^{T}y^{\prime} > p^{T}y$ once $\varepsilon$ is sufficiently small. However, by construction $y_{i}^{\prime} < y_{i}$ for every entry $i$, so $y^{\prime} \in L$, which forces $p^{T}y^{\prime} \leq p^{T}y$, a contradiction.
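As a sanity check on the last step, here is a short Python sketch (my own numbers, purely illustrative): take a would-be separator $p$ with a negative coordinate and build $y^{\prime}$ as in the proof. The constructed $y^{\prime}$ lies strictly below $y$ in every coordinate (so it belongs to $L$), yet its inner product with $p$ exceeds that of $y$, exhibiting the contradiction numerically.

```python
import numpy as np

# Hypothetical separator with a negative first coordinate, and a point y.
p = np.array([-2.0, 3.0])
y = np.array([0.0, 1.0])

# Choose eps small enough that eps * sum_{p_i >= 0} p_i < -sum_{p_i < 0} p_i.
eps = 0.1

# The construction from the proof: drop by 1 where p_i < 0, by eps elsewhere.
y_prime = np.where(p < 0, y - 1.0, y - eps)

assert np.all(y_prime < y)      # y' is componentwise below y, so y' is in L
assert p @ y_prime > p @ y      # yet p^T y' > p^T y -- contradiction
print(p @ y_prime - p @ y)      # the gap equals -sum_{p_i<0} p_i - eps*sum_{p_i>=0} p_i
```

Here the gap is $-(-2) - 0.1 \cdot 3 = 1.7 > 0$, matching the formula in the proof.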