Consider a 2-class pattern recognition problem with feature vectors in $R^2$. The class-conditional density for class I is uniform over $[1, 3] \times [1, 3]$ and that for class II is uniform over $[2, 4] \times [2, 4]$.
Now I have two questions.
- Suppose the prior probabilities are equal. In such a case, the Bayes classifier is given by $x + y = 5$.
- If the prior probabilities are changed to $p1 = 0.4$ and $p2 = 0.6$, what is the Bayes classifier now?
I solved the first part kinda "graphically" and intuitively, but I don't know how to solve the second one. Any help, hint or a solution, will be appreciated. Thanks!
"Analytically" you can proceed using Bayes theorem .
Using graphical intuition , note that the non-uniform priors "favor" a class over the other , so the points at the intersection have now a higher probability to belong to the second class .