Seeking an example for Bayes estimator of two unknown parameters


I searched the web using several search approaches; however, because the existing material overwhelmingly covers the Bayes estimator of a single unknown parameter (either for one-parameter distributions or with the remaining parameters assumed known), I could not find an example that explains, step by step, how to estimate two unknown parameters of a random variable.

(1) I found this question, in which the asker attempts such a problem, but it is only a partial attempt.

(2) I also found some papers on Bayes estimation and prediction for the two-parameter gamma distribution, but again I could not find a proof of the underlying relation.

Can anyone point me to a reference that works through such an example, with proof?

Accepted answer

Thanks to Tomas' comment, I went back to my studies and generalized the formula.

For a random variable with two parameters $p_1$ and $p_2$ (with independent priors $\pi(p_1)$ and $\pi(p_2)$), the Bayes estimator of $p_1$ (and of $p_2$, in the same manner) is the posterior mean \begin{align*} \hat{p}_1=E[p_1\mid x]=\int p_1\,\frac{f(x\mid p_1)\,\pi(p_1)}{f(x)}\,dp_1 =\int p_1\,\frac{\left(\int f(x\mid p_1,p_2)\,\pi(p_2)\,dp_2\right)\pi(p_1)}{\int\!\int f(x\mid p_1,p_2)\,\pi(p_2)\,dp_2\,\pi(p_1)\,dp_1}\,dp_1, \end{align*} where $f(x\mid p_1,p_2)$ is the likelihood given $p_1$ and $p_2$, and $\pi(p_1)$, $\pi(p_2)$ are the prior probability density functions. Note that the nuisance parameter $p_2$ is integrated out against its prior $\pi(p_2)$, not merely over $dp_2$.
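As a numerical illustration of the relation above (not part of the original answer), here is a sketch that approximates both integrals on a grid. It assumes, for concreteness, a normal sample with two unknown parameters $p_1=\mu$ and $p_2=\sigma$ and flat priors on the grid, so the posterior is proportional to the likelihood; the grid ranges and sample are made up for the example.

```python
import numpy as np

# Simulated data: normal with true mu = 2.0, sigma = 1.5 (assumed example values).
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=50)

# Grids over the two unknown parameters p1 = mu and p2 = sigma.
mu = np.linspace(-2.0, 6.0, 200)
sigma = np.linspace(0.2, 5.0, 200)
MU, SIG = np.meshgrid(mu, sigma, indexing="ij")   # MU[i, j] = mu[i], SIG[i, j] = sigma[j]

# Log-likelihood f(x | p1, p2) evaluated at every grid point.
loglik = (-len(x) * np.log(SIG)
          - ((x[None, None, :] - MU[..., None]) ** 2
             / (2.0 * SIG[..., None] ** 2)).sum(axis=-1))

# Flat priors pi(p1), pi(p2) on the grid, so posterior ∝ likelihood.
post = np.exp(loglik - loglik.max())
post /= post.sum()   # normalising constant plays the role of f(x)

# Bayes estimators = posterior means; each double sum approximates the
# double integral in the formula, integrating the other parameter out.
mu_hat = (MU * post).sum()
sigma_hat = (SIG * post).sum()
print(mu_hat, sigma_hat)
```

With 50 observations the posterior means land close to the true values; a denser grid or conjugate priors would sharpen the approximation.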

As an example that applies the above relation to the Bayes estimator of the negative binomial distribution, the following paper can be suggested:

Ganji, M., Eghbali, N., & Azimian, M. (2013). Bayes and Empirical Bayes Estimation of Parameter k in Negative Binomial Distribution. Journal of Hyperstructures, 2(2).