Bayes estimate for loss function $\ell(t,\alpha)=\frac{1}{\alpha^2}(t-\alpha^2)^2$?


I am given the following information about a sample $\{X_i\}_{i=1}^{n}$: $$X\sim f(x|\alpha)=\alpha x^{-(\alpha+1)}I(x>1).$$ Propose a convenient family of priors and find the Bayes estimate for the loss function $\ell(t,\alpha)=\frac{1}{\alpha^2}(t-\alpha^2)^2$.

The joint distribution is of the form $$f(\underline{X}|\alpha)=\alpha^n\prod_{i=1}^{n}X_i^{-(\alpha+1)}I(X_i>1)=\alpha^n e^{-(\alpha+1)\sum_{i=1}^{n}\log(X_i)}I(X_{(1)}>1).$$ Then the $\text{Gamma}(a,b)$ family seems like a convenient choice of prior, with posterior $\text{Gamma}(a+n,\sum_{i = 1}^{n}\log(X_i)+b)$.
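The conjugacy claim can be checked numerically. Below is a small sketch (the sample values and hyperparameters $a$, $b$ are arbitrary placeholders, not from the problem) verifying that likelihood times prior, viewed as a function of $\alpha$, is proportional to a $\text{Gamma}(a+n,\, b+R)$ density with $R=\sum_{i=1}^{n}\log(X_i)$:

```python
import math

# Hypothetical prior hyperparameters and a hypothetical sample (all > 1).
a, b = 2.0, 1.5
xs = [1.3, 2.7, 1.1, 4.2, 1.9]
n = len(xs)
R = sum(math.log(x) for x in xs)

def unnorm_posterior(alpha):
    """likelihood(alpha) * prior(alpha), keeping every factor that involves alpha."""
    loglik = n * math.log(alpha) - (alpha + 1) * R
    logprior = (a - 1) * math.log(alpha) - b * alpha
    return math.exp(loglik + logprior)

def gamma_pdf(alpha, shape, rate):
    """Gamma density in the shape/rate parameterization."""
    return (rate ** shape / math.gamma(shape)
            * alpha ** (shape - 1) * math.exp(-rate * alpha))

# If the posterior really is Gamma(a + n, b + R), this ratio is the same
# constant for every alpha (the constant absorbs the alpha-free factors).
ratios = [unnorm_posterior(al) / gamma_pdf(al, a + n, b + R)
          for al in (0.5, 1.0, 2.0, 3.5)]
assert max(ratios) / min(ratios) < 1 + 1e-9
```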

To find the posterior:

$$P(\alpha|X)\propto P(X|\alpha)P(\alpha)\propto\alpha^n e^{-\alpha\sum_{i=1}^{n}\log(X_i)}e^{-b\alpha}\alpha^{a-1}.$$

Collecting like terms gives the posterior. To find the Bayes estimate we minimize the posterior expected loss with respect to $t$: $$\int \ell(t,\alpha)f(\alpha|\underline{X})\,d\alpha=\int \frac{1}{\alpha^2}(t-\alpha^2)^2f(\alpha|\underline{X})\,d\alpha.$$ Since $\frac{\partial}{\partial t}\frac{1}{\alpha^2}(t-\alpha^2)^2=\frac{2t}{\alpha^2}-2$ and $\int (\frac{2t}{\alpha^2}-2)f(\alpha|\underline{X})\,d\alpha<\infty$, we have $$\frac{\partial}{\partial t}\int \ell(t,\alpha)f(\alpha|\underline{X})\,d\alpha=2\int \left(\frac{t}{\alpha^2}-1\right)f(\alpha|\underline{X})\,d\alpha = 2tE\bigg[\frac{1}{\alpha^2}\bigg|\underline{X}\bigg]-2.$$ Setting this equal to $0$ shows the Bayes estimate is $$\hat{t}=\frac{1}{E\bigg[\frac{1}{\alpha^2}\bigg|\underline{X}\bigg]}.$$

Define $R=\sum_{i = 1}^{n}\log(X_i)$. Then \begin{align} E\bigg[\frac{1}{\alpha^2}\bigg|\underline{X}\bigg] & =\int \alpha^{-2}\frac{(R+b)^{n+a}}{\Gamma(n+a)}\alpha^{n+a-1}e^{-\alpha(R+b)}d\alpha \\ & = \frac{(R+b)^{n+a}}{\Gamma(n+a)}\frac{\Gamma(n+a-2)}{(R+b)^{n+a-2}}\int \frac{(R+b)^{n+a-2}}{\Gamma(n+a-2)}\alpha^{n+a-2-1}e^{-\alpha(R+b)}d\alpha \\ & =\frac{(R+b)^{n+a}}{\Gamma(n+a)}\frac{\Gamma(n+a-2)}{(R+b)^{n+a-2}} \\ & =\frac{(R+b)^2}{(n+a-1)(n+a-2)}, \end{align} where the last step uses $\Gamma(n+a)=(n+a-1)(n+a-2)\Gamma(n+a-2)$. Then $\hat{t}=\frac{(n+a-1)(n+a-2)}{(\sum_{i = 1}^{n}\log(X_i)+b)^2}$.
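The closed form for the posterior inverse second moment can be sanity-checked by Monte Carlo, using the standard Gamma identity $E[\alpha^{-2}] = \text{rate}^2\,\Gamma(\text{shape}-2)/\Gamma(\text{shape})$. The numbers below for $a$, $b$, $n$, $R$ are illustrative placeholders:

```python
import random

random.seed(0)
a, b = 3.0, 2.0          # hypothetical prior hyperparameters
n, R = 20, 5.0           # hypothetical sample size and R = sum(log X_i)

# Sample from the Gamma(a + n, b + R) posterior; gammavariate takes
# (shape, scale), so the scale is 1/rate.
shape, rate = a + n, b + R
draws = [random.gammavariate(shape, 1.0 / rate) for _ in range(200_000)]

# Monte Carlo estimate of 1 / E[1/alpha^2 | X] versus the closed form
# (shape - 1)(shape - 2) / rate^2.
mc = 1.0 / (sum(al ** -2 for al in draws) / len(draws))
closed = (n + a - 1) * (n + a - 2) / (R + b) ** 2

assert abs(mc - closed) / closed < 0.02
```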

I am asked to find the asymptotic distribution as well.

We have $\frac{(n+a-1)(n+a-2)}{(\sum_{i = 1}^{n}\log(X_i)+b)^2}\stackrel{p}\to\alpha^2$, so it is a consistent estimator. I am trying to use the Central Limit Theorem, but I need to find the variance. Is my work up to now correct? If so, how do I find the asymptotic distribution?
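One way to probe the consistency claim empirically: under this density, $\log X_i \sim \text{Exponential}(\alpha)$, so $R$ is cheap to simulate. The sketch below (with an arbitrary true $\alpha$ and arbitrary hyperparameters) checks that the estimate approaches $\alpha^2$ for large $n$:

```python
import random

random.seed(1)
alpha = 1.7              # hypothetical true parameter
a, b = 2.0, 1.0          # hypothetical prior hyperparameters

def t_hat(n):
    """Bayes estimate computed from a simulated sample of size n.

    R = sum(log X_i) is simulated directly, since log X_i ~ Exponential(alpha)
    when X_i has density alpha * x^(-(alpha+1)) on (1, inf).
    """
    R = sum(random.expovariate(alpha) for _ in range(n))
    return (n + a - 1) * (n + a - 2) / (R + b) ** 2

# For large n the estimate should be close to alpha^2 = 2.89.
est = t_hat(100_000)
assert abs(est - alpha ** 2) < 0.1
```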

There is 1 solution below.


For a conceptual understanding of the task, it is important to keep in mind that, given $X_1, \ldots, X_n$, we are trying to estimate $\alpha > 0$.


  • A mistake you made is that the parameters, $\alpha = a + n$ and $\beta = \sum_{k = 1}^{n} \log(X_k) + b$, that you gave for the posterior distribution depend on the sample $(X_1, \ldots, X_n)$, which shouldn't be the case: the parameters should be independent of the input sample. For the computation of the posterior, see this Math.SE question.
  • You also seem to be claiming that $$ \int f(X, \alpha) \, d\alpha = 1, $$ which is not true (if you integrated over $X$ instead, it would be true, because $f$ is a density): consider $n = 1$. Then (with the convention $0 \cdot \infty = 0$) we have $$ \int_{0}^{\infty} \alpha x^{- \alpha - 1} 1_{\{ x > 1 \}} \, d\alpha = 1_{\{ x > 1 \}} \int_{0}^{\infty} \alpha x^{- \alpha - 1} \, d\alpha = \frac{1}{x \log^2(x)} 1_{\{ x > 1 \}}, $$ which in particular always depends on $x$.
  • Also, you seem to be claiming that $$ \int \frac{2}{\alpha^2} f(\alpha | \underline{X}) \, d\alpha = E\left[\frac{1}{\alpha^2} \,\Big|\, \underline{X}\right] = \int \alpha^{-2}\frac{(R+b)^{n+a}}{\Gamma(n+a)}\alpha^{n+a-1}e^{-\alpha(R+b)}\, d\alpha, $$ and I don't see how the term on the left and the term on the right could be equal, in particular because the term on the right depends on $a$ and $b$, while the term on the left does not. Lastly, I think those integrals should be definite ones over $(0, \infty)$, not indefinite ones.
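The dependence on $x$ in the second bullet can be confirmed numerically. The sketch below (with the arbitrary choice $x = 2$) compares a trapezoid approximation of $\int_0^\infty \alpha x^{-\alpha-1}\,d\alpha$ against the stated closed form $1/(x \log^2 x)$:

```python
import math

x = 2.0
h, upper = 1e-3, 60.0    # trapezoid step and truncation point; the
                         # integrand decays like x^(-alpha), so the tail
                         # beyond alpha = 60 is negligible for x = 2

grid = [i * h for i in range(int(upper / h) + 1)]
f = [al * x ** (-al - 1) for al in grid]
integral = h * (sum(f) - 0.5 * (f[0] + f[-1]))   # trapezoid rule

expected = 1.0 / (x * math.log(x) ** 2)
assert abs(integral - expected) < 1e-4
```

Rerunning with a different $x > 1$ gives a different value, which is exactly the point: the result of integrating out $\alpha$ is a function of $x$, not the constant $1$.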