Finding Bayes estimator with inverse-gamma prior and uniform likelihood.


Consider an i.i.d. sample of data from a population given by $$Y_i|\Psi \sim \text{Unif}(0, \Psi), \quad i = 1,2,\ldots, n,$$ with $\Psi$ having a prior distribution given by $$\Psi \sim \text{Inv-}\Gamma(\alpha, \beta),$$ where Inv-$\Gamma$ is the inverse Gamma distribution with pdf $$f_{\Psi}(z) = \frac{1}{\beta^\alpha\Gamma(\alpha)}\left(\frac{1}{z}\right)^{\alpha+1}\text{exp}\left\{-\frac{1}{\beta z}\right\}\quad \alpha,\beta > 0.$$

I know that the largest order statistic $Y^{(n)} = \max_i Y_i$ is a sufficient statistic for $\psi$.

If I want to find the Bayesian estimator of $\psi$ under squared error loss, I know that it is the posterior mean, which can be found via the following (rather heavy) integral: $$\hat{\psi}_B = \int_{Y^{(n)}}^{\infty}\psi\left[\frac{\left(\frac{1}{\psi}\right)^{\alpha + n + 1}e^{-1/(\beta\psi)}}{\int_{Y^{(n)}}^{\infty}\left(\frac{1}{\psi}\right)^{\alpha + n + 1}e^{-1/(\beta\psi)}\,d\psi}\right]d\psi$$

This can be somewhat simplified to the ratio of integrals $$\hat{\psi}_B = \frac{\int_{Y^{(n)}}^{\infty}\psi^{-(\alpha + n)}e^{-1/(\beta\psi)}\,d\psi}{\int_{Y^{(n)}}^{\infty}\psi^{-(\alpha + n + 1)}e^{-1/(\beta\psi)}\,d\psi}.$$

Directly evaluating this integral seems like it would be pretty hard.
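Even without a closed form, the ratio of integrals can be evaluated numerically as a baseline. A minimal sketch in Python, assuming purely illustrative values $\alpha = 2$, $\beta = 1.5$, $n = 5$ and observed maximum $y^{(n)} = 1.2$ (none of these come from the problem itself):

```python
import numpy as np
from scipy.integrate import quad

# Illustrative values (not from the problem statement).
alpha, beta, n, y_max = 2.0, 1.5, 5, 1.2

# Unnormalised posterior kernel on (y_max, inf):
# psi^{-(alpha+n+1)} * exp(-1/(beta*psi)).
def kernel(psi):
    return psi ** -(alpha + n + 1) * np.exp(-1.0 / (beta * psi))

num, _ = quad(lambda p: p * kernel(p), y_max, np.inf)  # integral of psi * kernel
den, _ = quad(kernel, y_max, np.inf)                   # normalising constant
psi_hat = num / den
print(psi_hat)
```

Since the posterior is supported on $(y^{(n)}, \infty)$, the resulting estimate must exceed the sample maximum, which is a quick sanity check on the quadrature.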

But the resource I am trying to learn this from claims that this integral can now be expressed as $$\hat{\psi}_B = \frac{1}{\beta(\alpha + n -1)}\frac{\Pr\left(\chi_{2(\alpha + n -1)}^2 < 2/(\beta y^{(n)})\right)}{\Pr\left(\chi_{2(\alpha + n)}^2< 2/(\beta y^{(n)})\right)},$$

where the probabilities above are the probability that a $\chi^2$ distributed random variable with the given degrees of freedom is less than $2/(\beta y^{(n)})$ (where $y^{(n)}$ is the maximum of the sample).
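To gain confidence in this identity before proving it, one can cross-check it numerically: compute the posterior mean by direct quadrature and compare it with the chi-square-CDF ratio. A sketch assuming illustrative values $\alpha = 2$, $\beta = 1.5$, $n = 5$, $y^{(n)} = 1.2$, chosen only for the check:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import chi2

# Illustrative values, chosen only for this numerical check.
alpha, beta, n, y_max = 2.0, 1.5, 5, 1.2

# Direct quadrature of the posterior mean.
kernel = lambda p: p ** -(alpha + n + 1) * np.exp(-1.0 / (beta * p))
num, _ = quad(lambda p: p * kernel(p), y_max, np.inf)
den, _ = quad(kernel, y_max, np.inf)
direct = num / den

# The claimed closed form: a ratio of chi-square CDFs.
c = 2.0 / (beta * y_max)
closed = chi2.cdf(c, 2 * (alpha + n - 1)) / chi2.cdf(c, 2 * (alpha + n))
closed /= beta * (alpha + n - 1)

print(direct, closed)  # the two values should agree
```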

I am generally familiar with the fact that the INVERSE-chi-squared distribution has pdf $$f(x\mid k) = \frac{2^{-k/2}}{\Gamma(k/2)}x^{-k/2 -1}e^{-1/(2x)}.$$
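As a quick numerical sanity check of how inverse-chi-squared and chi-squared tail areas relate (the relation that ultimately drives the rewriting): if $X \sim \chi^2_k$ then $1/X$ is inverse-chi-squared, so integrating the inverse-chi-squared pdf over $(t, \infty)$ should reproduce $\Pr(\chi^2_k < 1/t)$. A sketch with illustrative $k = 6$, $t = 0.5$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import chi2
from scipy.special import gamma

k, t = 6, 0.5  # illustrative degrees of freedom and threshold

# Inverse-chi-squared pdf: 2^{-k/2}/Gamma(k/2) * x^{-k/2-1} * exp(-1/(2x)).
def inv_chi2_pdf(x):
    return 2.0 ** (-k / 2) / gamma(k / 2) * x ** (-k / 2 - 1) * np.exp(-1.0 / (2 * x))

# P(Y > t) for Y ~ Inv-chi2_k, versus P(X < 1/t) for X ~ chi2_k.
tail, _ = quad(inv_chi2_pdf, t, np.inf)
print(tail, chi2.cdf(1.0 / t, k))  # should match
```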

The large and messy integral above can be rewritten in terms of the chi-squared kernel using the substitution $x = 2/(\beta\psi)$ (equivalently $\psi = 2/(\beta x)$), under which $e^{-1/(\beta\psi)}$ becomes $e^{-x/2}$ and the limits $(Y^{(n)}, \infty)$ become $(0, 2/(\beta Y^{(n)}))$. After a lot of factoring out and cancelling of powers of $\beta/2$, this gives

$$\hat{\psi}_B = \frac{2}{\beta}\cdot\frac{\int_{0}^{2/(\beta Y^{(n)})} x^{\alpha + n - 2}e^{-x/2}\,dx}{\int_{0}^{2/(\beta Y^{(n)})} x^{\alpha + n - 1}e^{-x/2}\,dx}.$$

It seems that this is tantalizingly close to being able to conclude that this is equivalent to
$$\frac{1}{\beta(\alpha + n -1)}\frac{\Pr\left(\chi_{2(\alpha + n -1)}^2 < 2/(\beta y^{(n)})\right)}{\Pr\left(\chi_{2(\alpha + n)}^2< 2/(\beta y^{(n)})\right)},$$ but I am not quite sure of the details. What route would I need to take to bridge the gap between the integral expression I have and the ratio of probabilities I want to achieve?