I am working on a problem involving Bayes' Theorem. I want to work out $P(A|X=x)$ for some event $A$ and random variable $X$, given that I know $P(X=x|\Sigma = \sigma)$ and $P(A| \Sigma = \sigma)$ and $P(\Sigma = \sigma)$. In my solution I write
$$P(A|X=x) = \sum_{\sigma}P(A, \Sigma = \sigma|X=x) $$
where the sum is over all possible values of $\Sigma$. Then I use Bayes' theorem $P(a,b|c) = P(a,c|b)\frac{P(b)}{P(c)}$ to write the above as
$$\sum_{\sigma}P(A, X = x| \Sigma = \sigma ) \frac{P(\Sigma =\sigma)}{P(X=x)}.$$
In this problem the first term factors as $P(A, X=x|\Sigma = \sigma) = P(A|\Sigma = \sigma)\,P(X=x|\Sigma = \sigma)$, i.e. $A$ and $X$ are conditionally independent given $\Sigma$. This leaves $P(X=x)$ as the only unknown term above. To compute it I use the law of total probability: $$P(X=x) = \sum_\sigma P(X=x| \Sigma = \sigma )P(\Sigma = \sigma ).$$
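As a sanity check of the discrete derivation, here is a small numerical sketch with made-up toy numbers (three values of $\Sigma$, with $A$ and $X$ treated as binary and assumed conditionally independent given $\Sigma$, as in the factorisation above). It computes $P(A|X=x)$ via the two steps described, then cross-checks against conditioning a full joint table directly:

```python
import numpy as np

# Hypothetical toy numbers (not from the problem): Sigma takes three values;
# A and X are binary events, conditionally independent given Sigma.
p_sigma = np.array([0.2, 0.5, 0.3])          # P(Sigma = sigma)
p_a_given_sigma = np.array([0.1, 0.6, 0.9])  # P(A | Sigma = sigma)
p_x_given_sigma = np.array([0.7, 0.2, 0.4])  # P(X = x | Sigma = sigma)

# Step 1: P(X = x) by the law of total probability.
p_x = np.sum(p_x_given_sigma * p_sigma)

# Step 2: P(A | X=x) = sum_sigma P(A|sigma) P(X=x|sigma) P(sigma) / P(X=x).
p_a_given_x = np.sum(p_a_given_sigma * p_x_given_sigma * p_sigma) / p_x

# Independent check: build the joint table P(a, x, sigma) over the binary
# indicators of A and X, then condition on X = x directly.
joint = np.einsum('s,as,xs->axs',
                  p_sigma,
                  np.stack([1 - p_a_given_sigma, p_a_given_sigma]),
                  np.stack([1 - p_x_given_sigma, p_x_given_sigma]))
direct = joint[1, 1, :].sum() / joint[:, 1, :].sum()
assert np.isclose(p_a_given_x, direct)
```

The two routes agree, which at least confirms the algebra in the discrete case.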
This works fine and dandy if $\Sigma$ is a discrete random variable. But what is the analogue of the above solution when $\Sigma$ is a continuous random variable? I am sure it will involve replacing the sums with integrals in some way, but I am just not seeing it at the moment.