Suppose I have a hierarchical Bayesian model, where my observational prediction, $y'$, is calculated as the sum of other parameters, ${\alpha_i}$.
My observation equation (the likelihood) is: $y \mid \{\alpha_i\}, \sigma \sim N(\sum_i \alpha_i, \sigma^2)$.
I want to perform inference on the $\{\alpha_i\}$ parameters.
Using the usual notation, let $\{\alpha_{-j}\}$ be the subset of $\{\alpha_i\}$ not including $\alpha_j$.
Considering $\alpha_j$, $P(\alpha_j \mid \{\alpha_{-j}\}, y, \sigma) \propto N(\alpha_j;\, y - \sum_{i \neq j}\alpha_i,\, \sigma^2)\,P(\alpha_j)$, where $P(\alpha_j)$ is the prior.
Now you can obviously implement a Gibbs sampler for this, sampling each $\alpha_i$ in turn, but the mixing will be slow if there are a lot of $\alpha$ parameters, since each parameter's conditional depends on all the others.
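To make the single-site scheme concrete, here is a minimal sketch of that Gibbs sampler. The priors are not specified in my setup above, so this assumes (hypothetically) independent $N(0, \tau^2)$ priors on each $\alpha_j$, which makes each conditional a product of two Gaussians and hence Gaussian; the values of $K$, $\sigma$, $\tau$, and $y$ are arbitrary illustration choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K alphas with assumed N(0, tau^2) priors, one observation y.
K, sigma, tau, y = 5, 1.0, 2.0, 3.0

def gibbs(n_iter=5000):
    alpha = np.zeros(K)
    draws = np.empty((n_iter, K))
    for t in range(n_iter):
        for j in range(K):
            # Residual: what alpha_j must explain given the current other alphas.
            r_j = y - (alpha.sum() - alpha[j])
            # Product of the N(r_j, sigma^2) likelihood term and the N(0, tau^2)
            # prior is again Gaussian (standard conjugate update).
            v = 1.0 / (1.0 / sigma**2 + 1.0 / tau**2)
            m = v * r_j / sigma**2
            alpha[j] = rng.normal(m, np.sqrt(v))
        draws[t] = alpha
    return draws

draws = gibbs()
print(draws[:, 0].mean())  # posterior mean estimate for alpha_1
```

Note how each update of $\alpha_j$ reads the current values of all the other components through the residual $r_j$ - exactly the coupling that slows the mixing down as $K$ grows.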
I am looking to see if there is a way of drawing from the joint distribution for this (which would eliminate this problem), but I can't see how to work out the joint distribution $P(\{\alpha_i\} | y, \sigma)$.
Note 1: This is not a homework problem.
Note 2: my actual problem is more complicated, because the observation equation actually depends on a weighted sum of the $\{\alpha_i\}$, and then I have to deal with priors etc. I think the problem will ultimately be tractable if I can understand how to do this bit.
Many thanks!
OK, after writing this I've realised that the full joint posterior is $P(\{\alpha_i\} \mid y, \sigma) \propto P(y \mid \{\alpha_i\}, \sigma)\,P(\{\alpha_i\})$ - the answer was right in front of me.
I guess my question relates to how to sample from this well - can it be expressed in a form where it is multivariate normal, for instance? I don't think so, because of the interdependence between the parameters. Does this mean that Metropolis or Metropolis-Hastings is the only way to go?
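For reference, here is a sketch of what the Metropolis-Hastings route would look like: a random-walk proposal on the whole $\{\alpha_i\}$ vector, targeting the unnormalised log-posterior $\log P(y \mid \{\alpha_i\}, \sigma) + \log P(\{\alpha_i\})$. As above, the $N(0, \tau^2)$ priors and all numerical values are hypothetical placeholders, and the fixed step size is the crudest possible choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup mirroring the Gibbs example: K alphas, assumed N(0, tau^2) priors.
K, sigma, tau, y = 5, 1.0, 2.0, 3.0

def log_post(alpha):
    # log P({alpha_i} | y, sigma) up to an additive constant:
    # Gaussian log-likelihood of y given the sum, plus Gaussian log-priors.
    return (-0.5 * (y - alpha.sum())**2 / sigma**2
            - 0.5 * np.sum(alpha**2) / tau**2)

def metropolis(n_iter=20000, step=0.5):
    alpha = np.zeros(K)
    lp = log_post(alpha)
    draws = np.empty((n_iter, K))
    accepted = 0
    for t in range(n_iter):
        prop = alpha + step * rng.standard_normal(K)  # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:       # MH accept/reject
            alpha, lp = prop, lp_prop
            accepted += 1
        draws[t] = alpha
    return draws, accepted / n_iter

draws, rate = metropolis()
print(rate)  # acceptance rate; tune `step` if this is very high or very low
```

Unlike the Gibbs sweep, this updates all components jointly in one accept/reject step, at the cost of having to tune the proposal scale.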