Multiple conditions for Bayes' theorem to extract a multivariate posterior distribution


I have a set of binomial data $\{r_i\}$ (success counts) for trial counts $\{N_i\}$, for which I know the distribution given the success parameter $p$: $P(\{r_i\}\ |\ \{N_i\},\ p)$.

Applying Bayes' theorem, the posterior for $p$ is:

\begin{align} P(p\ |\{r_i\},\ \{N_i\})=\cfrac{P(\{r_i\}\ |\ \{N_i\},\ p)\ P_0(p)}{\int P(\{r_i\}\ |\ \{N_i\},\ p)\ P_0(p)\ dp} \end{align}

where $P_{0}(p)$ is my prior.
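
For concreteness, here is a minimal numerical sketch of this one-parameter case (the data values and the flat prior are illustrative assumptions, not part of the question): it evaluates the binomial likelihood $P(\{r_i\}\ |\ \{N_i\},\ p)$ on a grid of $p$ values and normalises, which plays the role of the integral in the denominator above.

```python
import numpy as np
from scipy.stats import binom

r = np.array([3, 7, 5])        # hypothetical success counts {r_i}
N = np.array([10, 12, 9])      # hypothetical trial counts {N_i}

p_grid = np.linspace(0.001, 0.999, 1000)

# Likelihood P({r_i} | {N_i}, p): product of independent binomials,
# evaluated in log space for numerical stability.
log_like = np.array([binom.logpmf(r, N, p).sum() for p in p_grid])

prior = np.ones_like(p_grid)   # flat prior P_0(p) on (0, 1), as an example

# Posterior via Bayes' theorem; the Riemann sum approximates the
# normalising integral in the denominator.
unnorm = np.exp(log_like - log_like.max()) * prior
dp = p_grid[1] - p_grid[0]
posterior = unnorm / (unnorm.sum() * dp)
```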

How would this change if my success probability $p$ is a function $f(A,\ E_0)$ of two parameters $A$ and $E_0$, for which I consider Gaussian priors $P_0(A)$ and $P_0(E_0)$?

How does one apply Bayes' theorem to obtain the posterior $P(A,\ E_0\ |\ \{r_i\},\ \{N_i\})$ in this case? Thanks in advance!

1 Answer

Your formula from Bayes' theorem is still correct, but the prior for $p$, $P_0(p)$, is now the prior induced by $A$ and $E_0$: $$ P_0(p) = \int \delta\big(p - f(A, E_0)\big)\ P_0(A)\ P_0(E_0)\ dA\ dE_0, $$ that is, you would marginalise over $A$ and $E_0$ here; the Dirac delta picks out the set $\{(A, E_0)\ :\ f(A, E_0) = p\}$. I'm assuming that $f$ is a deterministic function and that if $A, E_0$ are valid values, then $f(A, E_0)$ is a valid value of $p$.
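
As a sketch of this marginalisation, the induced prior on $p$ can be approximated by sampling $(A, E_0)$ from their Gaussian priors and histogramming $f(A, E_0)$. The prior means and widths, and the logistic form of $f$, are illustrative assumptions, since the question does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(A, E0):
    # hypothetical link function mapping (A, E0) into (0, 1)
    return 1.0 / (1.0 + np.exp(-(A + E0)))

A_samples  = rng.normal(loc=0.0, scale=1.0, size=100_000)  # draws from P_0(A)
E0_samples = rng.normal(loc=0.5, scale=0.2, size=100_000)  # draws from P_0(E_0)

p_samples = f(A_samples, E0_samples)

# A normalised histogram of p_samples approximates the induced prior P_0(p).
density, edges = np.histogram(p_samples, bins=100, range=(0.0, 1.0), density=True)
```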

$P(A, E_0\ |\ \{r_i\}, \{N_i\})$ is given by $$ \frac{P(\{r_i\}\ |\ \{N_i\},\ A,\ E_0)\ P_0(A)\ P_0(E_0)}{\int P(\{r_i\}\ |\ \{N_i\},\ A,\ E_0)\ P_0(A)\ P_0(E_0)\ dA\ dE_0}, $$ and since the data $\{r_i\}$ depend on $A, E_0$ only through $p = f(A, E_0)$, this can be written as $$ P(A, E_0\ |\ \{r_i\}, \{N_i\}) = \frac{P(\{r_i\}\ |\ \{N_i\},\ p = f(A, E_0))\ P_0(A)\ P_0(E_0)}{\int P(\{r_i\}\ |\ \{N_i\},\ p = f(A, E_0))\ P_0(A)\ P_0(E_0)\ dA\ dE_0}. $$
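
To make this concrete, the joint posterior can be evaluated on a two-dimensional grid, with the Riemann sum standing in for the normalising double integral. This sketch reuses the hypothetical data and link function $f$ from the examples above, together with illustrative Gaussian prior parameters.

```python
import numpy as np
from scipy.stats import binom, norm

r = np.array([3, 7, 5])        # hypothetical success counts {r_i}
N = np.array([10, 12, 9])      # hypothetical trial counts {N_i}

def f(A, E0):
    # hypothetical f(A, E_0), as above
    return 1.0 / (1.0 + np.exp(-(A + E0)))

A_grid  = np.linspace(-3.0, 3.0, 200)
E0_grid = np.linspace(-0.5, 1.5, 200)
AA, EE = np.meshgrid(A_grid, E0_grid, indexing="ij")

p = f(AA, EE)                                 # p = f(A, E_0) at every grid point

# log likelihood: sum_i log Binom(r_i | N_i, p), broadcast over the grid
log_like = sum(binom.logpmf(ri, Ni, p) for ri, Ni in zip(r, N))

# Gaussian priors P_0(A) and P_0(E_0), with illustrative means and widths
log_prior = norm.logpdf(AA, 0.0, 1.0) + norm.logpdf(EE, 0.5, 0.2)

# Posterior on the grid, normalised so it integrates to 1
log_post = log_like + log_prior
post = np.exp(log_post - log_post.max())
dA, dE = A_grid[1] - A_grid[0], E0_grid[1] - E0_grid[0]
post /= post.sum() * dA * dE
```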