Consider a pair of random variables $(X,Y)$ with values in $\mathbb R^p \times \{0,1\}$ and distribution $$\mathbb P(Y=k) = \pi_k>0 \text{ and } \mathbb P(X \in \text{d}x|Y=k)=g_k(x)\text{d}x, \quad k\in\{0,1\},\ x\in \mathbb R^p,$$ where $\pi_0 + \pi_1 = 1$ and $g_0,g_1$ are two probability densities on $\mathbb R^p.$
We define the classifier $h_*(x) = \mathbb{1}_{\{\pi_1g_1(x)>\pi_0g_0(x)\}}, x \in \mathbb R^p.$
I need to prove that the classifier $h_*$ satisfies $$\mathbb P(h_*(X)\ne Y)=\min_h\mathbb P(h(X)\ne Y).$$
I started by determining the distribution of $X$.
By the law of total probability, I've established that the distribution of $X$ is $$\mathbb P(X\in A) = \int_A(\pi_0g_0(x) + \pi_1g_1(x))\text{d}x \quad \text{for every measurable } A \subseteq \mathbb R^p.$$
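As a sanity check of this mixture formula, here is a small numerical sketch in a hypothetical 1-D setting ($p=1$, Gaussian class densities and priors chosen for illustration): $\mathbb P(X\in A)$ computed by integrating $\pi_0g_0+\pi_1g_1$ should match the frequency obtained by simulating the generative model (draw $Y$ first, then $X$ given $Y$).

```python
import math
import random

# Illustrative assumptions (not from the problem statement):
# g0 = N(0, 1), g1 = N(2, 1), priors pi0 = 0.4, pi1 = 0.6.
pi0, pi1 = 0.4, 0.6

def g0(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def g1(x):
    return math.exp(-(x - 2) ** 2 / 2) / math.sqrt(2 * math.pi)

def mixture(x):
    # Density of X given by the law of total probability.
    return pi0 * g0(x) + pi1 * g1(x)

# P(X in A) for A = [0, 1] via the trapezoidal rule.
a, b, n = 0.0, 1.0, 10_000
h = (b - a) / n
quad = sum(0.5 * h * (mixture(a + i * h) + mixture(a + (i + 1) * h))
           for i in range(n))

# The same probability by simulating (X, Y): draw Y, then X | Y.
random.seed(0)
N = 200_000
hits = 0
for _ in range(N):
    y = 1 if random.random() < pi1 else 0
    x = random.gauss(2.0 if y == 1 else 0.0, 1.0)
    hits += (a <= x <= b)
mc = hits / N

print(quad, mc)  # the two estimates agree up to Monte Carlo noise
```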
I tried to compute $\mathbb P(h(X)\ne Y \mid h_*(X)\ne Y)$ but wasn't able to conclude anything from it.
Any help would be appreciated.
You have $$ \mathbb P(h_*(X)\neq Y)=\mathbb P(h_*(X)=1\mid Y=0)\,\mathbb P(Y=0)+\mathbb P(h_*(X)=0\mid Y=1)\,\mathbb P(Y=1) $$ $$ =\pi_0\int\limits_{\{\pi_1g_1>\pi_0g_0\}} g_0(x)\,dx+\pi_1\int\limits_{\{\pi_1g_1\le \pi_0g_0\}} g_1(x)\,dx $$ $$ =\pi_0\int\limits_{\{\pi_1g_1>\pi_0g_0\}} g_0(x)\,dx+\pi_1\Bigl(1-\int\limits_{\{\pi_1g_1> \pi_0g_0\}} g_1(x)\,dx\Bigr)=\pi_1+\int\limits_{\{\pi_1g_1> \pi_0g_0\}} \bigl(\pi_0g_0(x)-\pi_1g_1(x)\bigr)\,dx. $$ Now take any other classifier $h$ and let $R = \{x : h(x)=1\}$. The same computation yields $$ \mathbb P(h(X)\neq Y)=\pi_1+\int\limits_{R} \bigl(\pi_0g_0(x)-\pi_1g_1(x)\bigr)\,dx. $$ Now observe that the integrand $\pi_0g_0-\pi_1g_1$ is negative exactly on $\{\pi_1g_1>\pi_0g_0\}$, so the last integral is minimized by the choice $R=\{\pi_1g_1> \pi_0g_0\}$: any other $R$ either misses points where the integrand is negative or includes points where it is nonnegative, and in either case the integral can only be larger. Hence $\mathbb P(h_*(X)\neq Y)\le\mathbb P(h(X)\neq Y)$ for every $h$.
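To see the optimality concretely, here is a hedged Monte Carlo sketch in an illustrative 1-D Gaussian setting (my own choice; the proof above is fully general): with $g_0 = N(0,1)$, $g_1 = N(2,1)$ and $\pi_0=\pi_1=\tfrac12$, the condition $\pi_1g_1(x)>\pi_0g_0(x)$ reduces to $x>1$, so $h_*$ thresholds at $1$. Any other threshold rule corresponds to a different region $R$ and should have a strictly larger empirical error.

```python
import random

# Illustrative assumptions: g0 = N(0,1), g1 = N(2,1), pi0 = pi1 = 0.5.
# Then pi1*g1(x) > pi0*g0(x) iff x > 1, so the Bayes classifier h_* is 1{x > 1}.
pi1 = 0.5

def bayes(x):
    return 1 if x > 1.0 else 0

def sample(n, seed=0):
    # Simulate the generative model: draw Y, then X | Y.
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        y = 1 if rng.random() < pi1 else 0
        out.append((rng.gauss(2.0 if y else 0.0, 1.0), y))
    return out

data = sample(300_000)

def error(h):
    # Empirical misclassification rate P(h(X) != Y).
    return sum(h(x) != y for x, y in data) / len(data)

bayes_err = error(bayes)
# Competing rules h = 1{x > t} correspond to other choices of R = (t, inf);
# each one pays for including nonnegative-integrand points or dropping
# negative-integrand ones, so its error exceeds the Bayes error.
for t in (0.0, 0.5, 1.5, 2.0):
    assert error(lambda x: 1 if x > t else 0) > bayes_err
print(round(bayes_err, 3))  # close to Phi(-1), about 0.159
```

The printed error matches the theoretical Bayes risk $\pi_0\,\mathbb P(N(0,1)>1)+\pi_1\,\mathbb P(N(2,1)\le 1)=\Phi(-1)\approx 0.159$ in this setting, up to Monte Carlo noise.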