Posterior Probabilities derivation


I have a problem understanding posterior probabilities and Bayes' rule. First of all, my communications textbook gives this formula:

$$P(\textbf{s}_{m}|\textbf{y})=\frac{f(\textbf{y}|\textbf{s}_{m})P(\textbf{s}_{m})}{f(\textbf{y})}$$ where $$f(\textbf{y})= \sum_{m=1}^{M}f(\textbf{y}|\textbf{s}_{m})P(\textbf{s}_{m}),$$ $\textbf{y}$ is the received information vector, and $\textbf{s}_{m}$ is the transmitted signal vector (one of $M$ possible vectors).

What I do not understand is how it is possible to mix pdfs with discrete probabilities, and, on top of that, conditional probabilities. Could anybody give me a slightly more mathematical derivation? Could I say that

$$P(\textbf{y}\in A)=\int_{A} f(\textbf{y})\,d\textbf{y}$$ for a small region $A$ around $\textbf{y}$, and likewise $$P(\textbf{y}\in A|\textbf{s}_{m})=\int_{A} f(\textbf{y}|\textbf{s}_{m})\,d\textbf{y},$$ and then apply Bayes' rule, so that my formula above would follow because I can equate the integrands? That is, $$P(\textbf{s}_{m}|\textbf{y})\, P(\textbf{y}\in A)=P(\textbf{y}\in A|\textbf{s}_{m})\, P(\textbf{s}_{m})$$ Thank you.
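Writing the idea out more carefully (my own sketch, not from the book), the limiting argument I have in mind is: for a small ball $B_\epsilon(\textbf{y}_0)$ of volume $|B_\epsilon|$ around the observed point $\textbf{y}_0$,

$$P(\textbf{s}_{m}\mid \textbf{y}\in B_\epsilon(\textbf{y}_0)) = \frac{P(\textbf{y}\in B_\epsilon(\textbf{y}_0)\mid \textbf{s}_{m})\,P(\textbf{s}_{m})}{P(\textbf{y}\in B_\epsilon(\textbf{y}_0))} \approx \frac{f(\textbf{y}_0|\textbf{s}_{m})\,|B_\epsilon|\,P(\textbf{s}_{m})}{f(\textbf{y}_0)\,|B_\epsilon|} \xrightarrow{\ \epsilon\to 0\ } \frac{f(\textbf{y}_0|\textbf{s}_{m})\,P(\textbf{s}_{m})}{f(\textbf{y}_0)},$$

where the volumes cancel. Is this the rigorous justification for the mixed formula?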
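To convince myself numerically, here is a small Monte Carlo sketch of the formula (my own check, not from the book, assuming a scalar AWGN channel $y = s_m + n$ with Gaussian noise; the signal values, priors, and noise level are made up for illustration):

```python
import numpy as np

# Hypothetical setup: M = 2 scalar signals over an AWGN channel y = s_m + n,
# with n ~ N(0, sigma^2). The discrete priors P(s_m) and the continuous
# likelihood f(y | s_m) are combined exactly as in the textbook formula.
rng = np.random.default_rng(0)
s = np.array([-1.0, 1.0])      # candidate transmitted values s_1, s_2
prior = np.array([0.3, 0.7])   # P(s_m)
sigma = 0.8                    # noise standard deviation

def likelihood(y, m):
    """Gaussian pdf f(y | s_m)."""
    return np.exp(-(y - s[m]) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def posterior(y):
    """P(s_m | y) = f(y | s_m) P(s_m) / sum_k f(y | s_k) P(s_k)."""
    num = np.array([likelihood(y, m) * prior[m] for m in range(len(s))])
    return num / num.sum()

# Monte Carlo check of the limiting argument: among draws whose y lands in a
# small window around y0, the fraction that came from s_1 should approach the
# analytic posterior P(s_1 | y0).
N = 1_000_000
m_draw = rng.choice(len(s), size=N, p=prior)      # draw s_m from the prior
y_draw = s[m_draw] + sigma * rng.standard_normal(N)  # pass through the channel
y0, eps = 0.2, 0.02
near = np.abs(y_draw - y0) < eps
empirical = np.mean(m_draw[near] == 0)
analytic = posterior(y0)[0]
print(empirical, analytic)  # the two values should agree closely
```

The empirical conditional frequency matches the density-ratio formula, which is what the "equate the integrands" step is doing as the window shrinks.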