Consider the following model:
$$ \alpha \sim N(0,1)$$ $$ \beta \sim N(0,1)$$ $$ d_i \mid \alpha, \beta \sim \mathrm{Bernoulli}(\Phi(\alpha + \beta x_i))$$
$d_i$ is $1$ if person $i$ has some property, and $0$ if they do not. $i = 1, \dots, N$.
I have found the posterior for $\alpha$ and $\beta$ conditional on all the $d_i$. I am now asked how I may estimate $\mathbb{P}(\beta > 0 \mid d_1, \dots, d_N)$.
$p(\alpha, \beta \mid d_1, \dots, d_N) \propto \exp\left(- \frac{\alpha^2 + \beta^2}{2}\right)\prod_{i = 1}^N \Phi(\alpha + \beta x_i)^{d_i} \left(1 - \Phi(\alpha + \beta x_i)\right)^{1 - d_i} $
I am familiar with the Gibbs sampler and Metropolis-Hastings Monte Carlo methods; however, I'm not sure how to apply them in this case.
Many thanks.
You could run your MCMC targeting the posterior $p(\alpha, \beta \mid d_1,\dots,d_N)$. Note that the $d_i$ are observed data, so they stay fixed at their observed values, say $\mathbf d$; only $(\alpha, \beta)$ is sampled. After the burn-in period, calculate the proportion of sampled $\beta$'s that are greater than zero, and you have an estimate of $\mathbb P[\beta > 0 \mid (d_1,\dots,d_N)=\mathbf d]$.
On a side note, a Gibbs sampler would require the full conditionals $p(\alpha \mid \beta, d_1,\dots,d_N)$ and $p(\beta \mid \alpha, d_1,\dots,d_N)$, and neither is a standard distribution for this probit model, so you would need a more general Metropolis-Hastings algorithm anyway. Edit: From your given posterior, you can treat $(\alpha,\beta)$ as a single parameter vector and run Metropolis-Hastings on it directly: propose a new pair $(\alpha',\beta')$, say by a random walk, and accept or reject according to the ratio of your unnormalised posterior at the proposed and current values.
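As a sketch of the above, here is a minimal random-walk Metropolis-Hastings sampler in plain Python. The toy data `x` and `d`, the proposal step size, and the iteration counts are all illustrative assumptions, not part of the question; $\Phi$ is computed from `math.erf`.

```python
import math
import random

def phi(x):
    # Standard normal CDF, Phi(x), via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def log_posterior(a, b, x, d):
    # Log of the unnormalised posterior from the question:
    # N(0,1) priors on alpha and beta, plus the Bernoulli-probit likelihood
    lp = -(a * a + b * b) / 2.0
    for xi, di in zip(x, d):
        p = phi(a + b * xi)
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        lp += di * math.log(p) + (1 - di) * math.log(1.0 - p)
    return lp

def rw_metropolis(x, d, n_iter=20000, burn_in=5000, step=0.2, seed=1):
    # Random-walk Metropolis-Hastings on the pair (alpha, beta);
    # the observed d stays fixed throughout, only (alpha, beta) moves.
    rng = random.Random(seed)
    a, b = 0.0, 0.0
    lp = log_posterior(a, b, x, d)
    betas = []
    for t in range(n_iter):
        a_new = a + rng.gauss(0.0, step)
        b_new = b + rng.gauss(0.0, step)
        lp_new = log_posterior(a_new, b_new, x, d)
        # Accept with probability min(1, posterior ratio)
        if rng.random() < math.exp(min(0.0, lp_new - lp)):
            a, b, lp = a_new, b_new, lp_new
        if t >= burn_in:
            betas.append(b)
    return betas

# Hypothetical data: d tends to be 1 for larger x, so beta should be positive
x = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.25, 0.5, 1.0, 1.5, 2.0]
d = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

betas = rw_metropolis(x, d)
prob = sum(b > 0 for b in betas) / len(betas)
print(prob)  # estimate of P(beta > 0 | data)
```

The final line is the Monte Carlo estimate itself: the fraction of post-burn-in $\beta$ draws that are positive.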