I would like to know the meaning or interpretation of the parameter $\beta$ in a Bayesian model. I have a Poisson model like this:
$s_{i} \mid \lambda_{i} \sim \text{Poisson}(\lambda_{i}t_{i}),$
where
$\lambda_i\mid\beta\sim\mathcal{G}a(\lambda_i\mid 1.8,\beta)$
and
$\beta\sim\mathcal{G}a(\beta\mid 0.01,1).$
As data I have the following table (which can be found in the article "Robust Empirical Bayes Analyses of Event Rates" by O'Muircheartaigh & Gaver):
| System | fails ($s_{i}$) | time ($t_{i}$) |
|---|---|---|
| $1$ | $5$ | $94.32$ |
| $2$ | $1$ | $15.72$ |
| $3$ | $5$ | $62.88$ |
| $4$ | $14$ | $125.76$ |
| $5$ | $13$ | $5.24$ |
| $6$ | $19$ | $31.44$ |
| $7$ | $1$ | $1.048$ |
| $8$ | $1$ | $1.048$ |
| $9$ | $4$ | $2.096$ |
| $10$ | $22$ | $10.48$ |
I thought that $\beta$ was a kind of prior knowledge, but that interpretation seems too simplistic. Is there something hidden that I am not taking into account? Is it something that will be revealed to me later, when I try to find the posterior distribution?
It's a hierarchical prior: there is an additional prior on a parameter of the first-level prior.
Here is how to derive the unconditional (marginal) distribution of $\lambda_i$. There's a trick: by the scaling property of the Gamma distribution, $\lambda_i \mid \beta \sim \mathcal{G}a(1.8, \beta)$ can be written as $\lambda_i = \frac{1}{2\beta}\chi^2_{3.6}$, since $\mathcal{G}a(1.8, \tfrac12) = \chi^2_{3.6}$. Similarly, $\beta \sim \mathcal{G}a(0.01, 1)$ means $2\beta \sim \chi^2_{0.02}$. Combining the two, $\lambda_i \sim \frac{\chi^2_{3.6}}{\chi^2_{0.02}}$ with independent numerator and denominator. This looks like a Fisher $F$ distribution, except that the degrees-of-freedom factors are missing. It is in fact a Beta prime distribution, $\lambda_i \sim \beta'(1.8, 0.01)$: the factors of $2$ cancel, and a ratio of independent $\mathcal{G}a(a, 1)$ and $\mathcal{G}a(b, 1)$ variables follows $\beta'(a, b)$. Because of the derivation I've just shown, it is sometimes called the Gamma-Gamma distribution.
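You can check this derivation by simulation. Here is a minimal sketch with NumPy, assuming the rate parameterisation of the Gamma throughout (so NumPy's `scale` argument is $1/\beta$); it draws $\lambda_i$ through the two-step hierarchy and compares quantiles with direct draws from the claimed $\beta'(1.8, 0.01)$ marginal:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500_000

# Step 1 of the hierarchy: beta ~ Ga(0.01, 1) (shape 0.01, rate 1).
beta = rng.gamma(shape=0.01, scale=1.0, size=n)
# With such a tiny shape, draws can underflow to exactly 0.0 in double
# precision; clip them so the division below stays finite.
beta = np.maximum(beta, np.finfo(float).tiny)

# Step 2: lambda | beta ~ Ga(1.8, rate=beta), i.e. scale = 1/beta in NumPy.
lam_hier = rng.gamma(shape=1.8, scale=1.0 / beta)

# Direct draws from the claimed marginal beta'(1.8, 0.01), realised as a
# ratio of independent Ga(1.8, 1) and Ga(0.01, 1) variables.
num = rng.gamma(shape=1.8, scale=1.0, size=n)
den = np.maximum(rng.gamma(shape=0.01, scale=1.0, size=n), np.finfo(float).tiny)
lam_bp = num / den

# The two samples should agree; compare central quantiles on the log10
# scale, since the distribution is extremely heavy-tailed.
for q in (0.25, 0.5, 0.75):
    print(q, np.log10(np.quantile(lam_hier, q)), np.log10(np.quantile(lam_bp, q)))
```

The heavy tails (the $\chi^2_{0.02}$ in the denominator puts enormous mass near zero) mean moments are useless here, which is why the comparison is done on quantiles.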