Upper bound for Bayes error


The Bayes decision for the two-class classification problem results in the Bayes error $$P(\mathrm{error})=\int P(\mathrm{error} \, \vert \, x) p(x) \, \mathrm{d}x$$ where $P(\mathrm{error} \, \vert \, x)=\min\bigl\{P(\omega_1 \, \vert\, x), \, P(\omega_2 \, \vert \, x)\bigr\}$ is the probability of error for a particular input $x$.

Now I want to find an upper-bound for the error such that $$P(\mathrm{error}) \leq \int \frac{2}{\frac{1}{P(\omega_1 \, \vert\, x)}+\frac{1}{P(\omega_2 \, \vert\, x)}}p(x) \, \mathrm{d}x.$$

My approach is

\begin{align} P(\mathrm{error})&= \int \min\bigl\{P(\omega_1 \, \vert\, x), \, P(\omega_2 \, \vert \, x)\bigr\} p(x) \, \mathrm{d}x \\ & \leq \int \frac{P(\omega_1 \, \vert\, x)+ P(\omega_2 \, \vert \, x)}{2}p(x) \, \mathrm{d}x \\ &=\int \frac{1}{\frac{2}{P(\omega_1 \, \vert\, x)+ P(\omega_2 \, \vert \, x)}}p(x) \, \mathrm{d}x \end{align}

From here I don't know how this leads me to my desired inequality. Any hints?
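As a quick numerical sanity check (a small Python sketch; the sample values of $a = P(\omega_1 \vert x)$ are arbitrary illustrations with $a+b=1$), one can compare the three quantities involved: the minimum, the harmonic mean $2/(1/a+1/b)$ from the desired bound, and the arithmetic mean $(a+b)/2$ from the derivation above. The harmonic mean always sits between the other two, which suggests the arithmetic-mean bound is looser than the one asked for:

```python
# Compare min, harmonic mean (HM), and arithmetic mean (AM)
# of the two posteriors a = P(w1|x), b = P(w2|x), with a + b = 1.
for a in [0.1, 0.3, 0.5]:
    b = 1 - a
    m = min(a, b)
    hm = 2 / (1 / a + 1 / b)  # harmonic mean: the desired upper bound
    am = (a + b) / 2          # arithmetic mean: the bound derived above
    assert m <= hm <= am      # min <= HM <= AM holds for positive a, b
    print(f"a={a:.1f}: min={m:.3f}  HM={hm:.3f}  AM={am:.3f}")
```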



Best answer

Forget about probabilities and integrals. All you have to prove is that, for any $a,b>0$,

$$\min(a,b)\le \frac{2}{1/a+1/b}= \frac{2ab}{a+b}.$$

(For the sake of this problem, we might simplify this a little because in our case $a+b=1$; but the property is more general.)

Suppose $a<b$. Then $$a+b<2b \implies \frac{1}{a+b}> \frac{1}{2b} \implies \frac{2ab}{a+b}> \frac{2ab}{2b}=a=\min(a,b)$$

I'll leave the cases $a=b$ and $a>b$ to you... (and $a=0$ or $b=0$)
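The inequality $\min(a,b)\le 2ab/(a+b)$ can also be checked numerically over many random positive pairs (a small Python sketch, not part of the original answer; the sampling range is an arbitrary choice):

```python
import random

random.seed(0)

# Randomized check of min(a, b) <= 2ab/(a+b) for positive a, b.
for _ in range(1000):
    a = random.uniform(1e-6, 10.0)
    b = random.uniform(1e-6, 10.0)
    harmonic_mean = 2 * a * b / (a + b)
    # Small tolerance guards against floating-point rounding.
    assert min(a, b) <= harmonic_mean + 1e-12

print("min(a, b) <= 2ab/(a+b) held on all sampled pairs")
```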