Is it possible to get 3 decision criteria using Bayes' theorem?


I was wondering whether it is possible to get 3 intersection points if I use Bayes' theorem:

$$P(B\mid x) = \frac{P(x\mid B) \times P(B)}{P(x)}$$ where $P(x\mid B)$ is a Gaussian function: $$ P(x\mid B) = \frac{1}{\sigma_b\sqrt{2\pi}} \exp\left[-\frac{1}{2} \frac{(x - \mu_b)^2}{\sigma_b^2}\right]$$

I have already found that there is exactly 1 intersection point when the variances are equal and the prior probabilities are equal. But is it possible to get 3 intersection points?
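To illustrate the equal-variance, equal-prior case numerically, here is a minimal Python sketch (the means, variance, and prior values are made up for illustration). It checks that the two unnormalised posteriors $P(x\mid B)\,P(B)$ agree at the midpoint of the two means, which is the single intersection point in that case:

```python
import math

def likelihood(x, mu, sigma):
    """Gaussian likelihood P(x | B) for a class with mean mu and std sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Two classes with equal variance and equal priors (hypothetical numbers).
mu1, mu2, sigma, prior = -1.0, 3.0, 1.5, 0.5

# Compare unnormalised posteriors P(x|B) * P(B); the evidence P(x) is the
# same for both classes, so it cancels out of the comparison.
midpoint = (mu1 + mu2) / 2
p1 = likelihood(midpoint, mu1, sigma) * prior
p2 = likelihood(midpoint, mu2, sigma) * prior
assert abs(p1 - p2) < 1e-12  # the single crossing sits at the midpoint
```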

(To be exact, only 2 functions will be compared.)

Update: (I don't know how to explain it better, but maybe this image helps?)


There is 1 answer below.


No, your posterior distributions can cross each other at no more than two points. Because both are Gaussian, the equation for their intersection (written here for equal priors, so that $P(B)$ and $P(x)$ cancel out of the comparison) looks like this: $$\frac1{\sigma_1\sqrt{2\pi}}\exp\left[-\frac{(x-\mu_1)^2}{2\sigma_1^2}\right]=\frac1{\sigma_2\sqrt{2\pi}}\exp\left[-\frac{(x-\mu_2)^2}{2\sigma_2^2}\right]$$ We can rearrange this as follows: $$\exp\left[-\frac{(x-\mu_1)^2}{2\sigma_1^2}\right]\div\exp\left[-\frac{(x-\mu_2)^2}{2\sigma_2^2}\right]=\frac{\sigma_1}{\sigma_2}$$ $$\exp\left[-\frac{(x-\mu_1)^2}{2\sigma_1^2}+\frac{(x-\mu_2)^2}{2\sigma_2^2}\right]=\frac{\sigma_1}{\sigma_2}$$ $$-\frac{(x-\mu_1)^2}{2\sigma_1^2}+\frac{(x-\mu_2)^2}{2\sigma_2^2}=\log\left(\frac{\sigma_1}{\sigma_2}\right)$$ $$\left(\frac{1}{2\sigma_2^2}-\frac{1}{2\sigma_1^2}\right)x^2+\left(-\frac{\mu_2}{\sigma_2^2}+\frac{\mu_1}{\sigma_1^2}\right)x+\left(\frac{\mu_2^2}{2\sigma_2^2}-\frac{\mu_1^2}{2\sigma_1^2}\right)=\log\left(\frac{\sigma_1}{\sigma_2}\right)$$

This is a quadratic in $x$, so it has either 0, 1, or 2 real roots; 3 intersection points are impossible. Unequal priors only add the constant $\log\left(\frac{P(B_2)}{P(B_1)}\right)$ to the right-hand side, which leaves the equation quadratic, so the conclusion still holds. (When $\sigma_1 = \sigma_2$, the quadratic term vanishes and the equation becomes linear, giving the single intersection you already found.)
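As a sanity check, here is a short Python sketch (function names are my own) that computes the coefficients of this quadratic, solves it, and verifies the two densities really agree at the returned crossings. It falls back to the linear case when the variances are equal:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a Gaussian with mean mu and std sigma at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def intersection_points(mu1, sigma1, mu2, sigma2):
    """Roots of the quadratic derived above:
    (1/(2*s2^2) - 1/(2*s1^2)) x^2 + (mu1/s1^2 - mu2/s2^2) x
      + (mu2^2/(2*s2^2) - mu1^2/(2*s1^2)) - log(s1/s2) = 0
    Returns a list of 0, 1, or 2 real crossing points."""
    a = 1 / (2 * sigma2**2) - 1 / (2 * sigma1**2)
    b = mu1 / sigma1**2 - mu2 / sigma2**2
    c = mu2**2 / (2 * sigma2**2) - mu1**2 / (2 * sigma1**2) - math.log(sigma1 / sigma2)
    if abs(a) < 1e-12:
        # Equal variances: the equation is linear (or degenerate if the
        # two Gaussians are identical, in which case they cross everywhere).
        return [] if abs(b) < 1e-12 else [-c / b]
    disc = b * b - 4 * a * c
    if disc < 0:
        return []
    r = math.sqrt(disc)
    return sorted([(-b - r) / (2 * a), (-b + r) / (2 * a)])

# Unequal variances: exactly two crossings, where the densities agree.
for x in intersection_points(0.0, 1.0, 1.0, 2.0):
    assert abs(gaussian_pdf(x, 0, 1) - gaussian_pdf(x, 1, 2)) < 1e-9
```

With equal variances the function returns the single midpoint crossing, e.g. `intersection_points(0.0, 1.0, 1.0, 1.0)` gives `[0.5]`, matching the equal-variance case you already found.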