I have a probability distribution for some quantity $A$ given a fixed $B$, i.e. $P(A|B)$. I also have a prior distribution $P(B)$ for $B$. I'm trying to find the distribution $P(A)$.
I thought about using Bayes' theorem, which implies that:
$P(A) = \frac{P(B)P(A|B)}{P(B|A)}$
But I don't have any information about $P(B|A)$, or at least I don't think I do...
I suspect I'm missing something obvious here. Can anyone help out?
Thanks!
EDIT
Based on the answers given I fear I might have tried to frame the question in too general a manner. The specific forms I'm working with are as follows:
$P(A|B) = A\exp{\left[-\frac{A^2 + B^2}{2}\right]} I_0\left(AB\right)$ (a slightly modified Rice distribution)
$P(B) \propto B^{-n}$ (generic power-law with $n>0$)
Does this change the situation at all? Thanks!
I understand from your question that you have the distributions $P(B)$ and $P(A|B)$ in explicit functional form.
Then the joint distribution is $P(A, B)=P(A|B)P(B)$. To find the marginal distribution $P(A)$, sum $P(A, B)$ over $B$; in the continuous case, integrate the joint density over $B$ instead. See http://en.wikipedia.org/wiki/Marginal_distribution for more information.
$$P(A=a)=\sum_{b}P(a,b)=\sum_{b}P(A=a|B=b)P(B=b).$$
I guess this is what you want.
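To make the discrete marginalization concrete, here is a minimal sketch. The two-state distributions below are made-up toy numbers, purely for illustration:

```python
# Marginalization in the discrete case: P(A=a) = sum_b P(A=a|B=b) P(B=b).
# The states and probabilities below are hypothetical toy values.

# Prior P(B) over two states of B.
p_b = {"b1": 0.3, "b2": 0.7}

# Conditional P(A|B): for each state of B, a distribution over states of A.
p_a_given_b = {
    "b1": {"a1": 0.9, "a2": 0.1},
    "b2": {"a1": 0.2, "a2": 0.8},
}

# Marginal P(A): sum the joint P(a, b) = P(a|b) P(b) over b.
p_a = {}
for b, pb in p_b.items():
    for a, pab in p_a_given_b[b].items():
        p_a[a] = p_a.get(a, 0.0) + pab * pb

print(p_a)  # a1: 0.9*0.3 + 0.2*0.7 = 0.41, a2: 0.1*0.3 + 0.8*0.7 = 0.59
```

Note that the result is automatically a valid distribution: the entries sum to 1 because each conditional and the prior do.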
Edit (to match the edit of the question): Your case is continuous, so use integration as below. Since the Rice distribution has support on the non-negative reals, integrating over $[0, +\infty)$ is sufficient:
$$P(A=a)=\int_{0}^{+\infty}P(a,b)\,db=\int_{0}^{+\infty}P(A=a|B=b)\,P(B=b)\,db.$$
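If the integral has no closed form, it can be evaluated numerically. Below is a sketch using the Rice-type conditional and the power-law prior from the question. The truncation limits `b_min`, `b_max` are my own assumption: $P(B) \propto B^{-n}$ is improper on $(0, \infty)$, so it must be restricted to a finite interval (and normalized there) before the marginal is well defined.

```python
# Continuous marginalization P(a) = ∫ P(a|b) P(b) db, evaluated numerically.
import numpy as np
from scipy.integrate import quad
from scipy.special import i0e

def p_a_given_b(a, b):
    # Rice-type density: a * exp(-(a^2 + b^2)/2) * I0(a*b).
    # Written via i0e (exponentially scaled I0) for numerical stability:
    # I0(x) = i0e(x) * exp(x), so the exponents combine into -(a - b)^2 / 2.
    return a * np.exp(-0.5 * (a - b) ** 2) * i0e(a * b)

def make_prior(n, b_min, b_max):
    # Truncated, normalized power-law prior P(b) ∝ b^{-n} on [b_min, b_max].
    # (The truncation interval is an assumption; the untruncated prior is improper.)
    norm, _ = quad(lambda b: b ** (-n), b_min, b_max)
    return lambda b: b ** (-n) / norm

def p_a(a, prior, b_min, b_max):
    # Marginal at a: integrate the joint P(a|b) P(b) over b.
    val, _ = quad(lambda b: p_a_given_b(a, b) * prior(b), b_min, b_max)
    return val

prior = make_prior(n=2.0, b_min=0.1, b_max=10.0)
# Sanity check: the marginal should itself integrate to 1 over a in [0, ∞).
total, _ = quad(lambda a: p_a(a, prior, 0.1, 10.0), 0.0, np.inf)
print(total)
```

The sanity check works because the Rice-type conditional is itself a proper density in $a$ for each fixed $b$, so the marginal inherits normalization from the (truncated, normalized) prior.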