Bound on error probability of 1-D ideal Stoller split


Consider the binary decision rule $g_c: \mathcal{X} \rightarrow \{0, 1\}$ given by:

$$g_c(x) = \begin{cases} 1& \text{if } x \geq c\\ 0 & \text{otherwise} \end{cases}$$

Show that the minimum probability of error satisfies:

$$\min_c\mathbb{P}(g_c(X) \neq Y) \leq \frac{4p(1-p)}{1+p(1-p)\frac{(m_0-m_1)^2}{(1-p)\sigma_0^2+p\sigma_1^2}}$$

where $p = \mathbb{P}(Y=1)$ is the class-1 prior probability and $m_i, \sigma_i^2$ are the mean and variance of $X$ conditional on $Y = i$.
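Before attacking the proof, a quick Monte Carlo sanity check of the claimed bound. The Gaussian class-conditionals and all parameter values below are my own illustrative choices (the bound itself only depends on the class moments), and I take $p = \mathbb{P}(Y=1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, m0, m1, s0, s1 = 0.3, 0.0, 2.0, 1.0, 1.5  # illustrative values; p = P(Y = 1)

# Sample (X, Y): Y ~ Bernoulli(p), X | Y = i ~ N(m_i, sigma_i^2).
n = 200_000
Y = rng.random(n) < p
X = np.where(Y, rng.normal(m1, s1, n), rng.normal(m0, s0, n))

# Empirical min_c P(g_c(X) != Y) over a grid of thresholds c.
cs = np.linspace(-5.0, 7.0, 1201)
err_min = min(np.mean((X >= c) != Y) for c in cs)

# Right-hand side of the claimed bound.
rhs = 4*p*(1-p) / (1 + p*(1-p)*(m0 - m1)**2 / ((1-p)*s0**2 + p*s1**2))
print(err_min, rhs)
```

For this setup the empirical minimum error sits comfortably below the bound, as expected.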

What I've tried

Rewriting the error probability:

$$ \begin{align} \mathbb{P}(g_c(X) \neq Y) &= \mathbb{P}(X-c \geq 0,\ 2Y-1=-1) + \mathbb{P}(X-c < 0,\ 2Y-1=1) \\ &\leq \mathbb{P}\left(\left |u(X-c) - (2Y-1) \right|\geq 1\right) \end{align} $$

which holds for every $u > 0$, since each error event forces $|u(X-c)-(2Y-1)| \geq 1$ (the inequality can be strict, e.g. when $Y=0$ and $u(X-c) \leq -2$). Applying Markov's inequality to the square (the Chebyshev-type argument):

$$ \leq \mathbb{E}[(u(X-c) - (2Y-1))^2]$$

simplifying this by conditioning on $Y$ (note that $(2Y-1)^2 = 1$ contributes $+1$ to each class term, and the cross term enters with opposite signs for the two classes):

$$ = (1-p)\left(u^2(\sigma_0^2 + (m_0-c)^2) + 2u(m_0-c) + 1\right) + p\left(u^2(\sigma_1^2 + (m_1-c)^2) - 2u(m_1-c) + 1\right)$$
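A quick Monte Carlo check of this conditional expansion, $\mathbb{E}[(u(X-c)-(2Y-1))^2] = (1-p)\big(u^2(\sigma_0^2+(m_0-c)^2)+2u(m_0-c)+1\big) + p\big(u^2(\sigma_1^2+(m_1-c)^2)-2u(m_1-c)+1\big)$. The Gaussian class-conditionals and all parameter values are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(1)
p, m0, m1, s0, s1 = 0.3, 0.0, 2.0, 1.0, 1.5  # arbitrary; p = P(Y = 1)
c, u = 0.8, 0.6                               # arbitrary threshold and scale

n = 1_000_000
Y = rng.random(n) < p
X = np.where(Y, rng.normal(m1, s1, n), rng.normal(m0, s0, n))
mc = np.mean((u*(X - c) - (2*Y - 1))**2)      # direct Monte Carlo estimate

# Closed-form expansion by conditioning on Y (the +1 per class comes from
# (2Y-1)^2 = 1; the cross term flips sign between the two classes).
closed = ((1-p)*(u**2*(s0**2 + (m0-c)**2) + 2*u*(m0-c) + 1)
          + p*(u**2*(s1**2 + (m1-c)**2) - 2*u*(m1-c) + 1))

print(mc, closed)
```

The two values agree to Monte Carlo precision, which supports the signs and the constant term in the expansion.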

but then I'm having trouble minimizing this bound jointly over $c$ and $u$. I've also tried minimizing the expression with symbolic software, but it does not return a solution, so I suspect either a mistake in my simplification or a missing trick in the minimization.
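In case it helps with the stuck step: for fixed $c$ the bound $\mathbb{E}[(u(X-c)-(2Y-1))^2] = u^2 M_2(c) - 2u A(c) + 1$, with $A(c)=\mathbb{E}[(X-c)(2Y-1)]$ and $M_2(c)=\mathbb{E}[(X-c)^2]$, is just a parabola in $u$, so the inner minimization is closed-form, leaving a one-dimensional problem in $c$. Numerically, a grid over $c$ appears to reproduce the claimed right-hand side exactly. A sketch with my own illustrative parameter values:

```python
import numpy as np

p, m0, m1, s0, s1 = 0.3, 0.0, 2.0, 1.0, 1.5  # illustrative; p = P(Y = 1)

# Population moments of X.
mu  = (1-p)*m0 + p*m1
var = (1-p)*s0**2 + p*s1**2 + p*(1-p)*(m1 - m0)**2

# For fixed c, the parabola u^2*M2(c) - 2u*A(c) + 1 is minimized at
# u = A(c)/M2(c), with minimum value 1 - A(c)^2/M2(c).
# (At the minimizing c the optimal u is positive when m1 > m0,
# so the u > 0 restriction is not binding here.)
cs = np.linspace(-10.0, 12.0, 400_001)
A  = p*(m1 - cs) - (1-p)*(m0 - cs)   # A(c) = E[(X-c)(2Y-1)]
M2 = var + (mu - cs)**2              # M2(c) = E[(X-c)^2]
best = float(np.min(1 - A**2 / M2))

rhs = 4*p*(1-p) / (1 + p*(1-p)*(m0 - m1)**2 / ((1-p)*s0**2 + p*s1**2))
print(best, rhs)
```

The agreement suggests the optimal $(c, u)$ makes the quadratic bound coincide with the stated right-hand side, so the remaining work is carrying out that one-dimensional minimization in $c$ analytically.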