How do we find the optimal estimator for $g(\theta) = \theta^{2}$ where $X_{i}\sim\mathcal{N}(\theta,1)$?


Suppose that we are given a random sample $X_{1},X_{2},\ldots,X_{n}$ where each $X_{i}\sim\mathcal{N}(\theta,1)$. I have been asked to prove that the optimal estimator of $\theta^{2}$ is $\delta(X) = \overline{X}^{2} - 1/n$, but I have not been able to proceed. As far as I know, when we talk about optimal estimators (in the context of classical inference) we are interested in unbiased estimators with minimum variance. Having said that, could someone give me at least a hint as to how to proceed?
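Before attempting a formal proof, a quick Monte Carlo sanity check (with illustrative values $\theta = 2$, $n = 10$, not taken from the question) makes the target concrete: the naive estimator $\overline{X}^{2}$ overshoots $\theta^{2}$ by exactly $V[\overline{X}] = 1/n$, which is where the $-1/n$ correction comes from.

```python
import numpy as np

# Illustrative simulation (assumed values theta = 2, n = 10):
# E[Xbar^2] = Var(Xbar) + theta^2 = 1/n + theta^2, so Xbar^2 is
# biased upward by 1/n, and Xbar^2 - 1/n is unbiased for theta^2.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

samples = rng.normal(theta, 1.0, size=(reps, n))
xbar = samples.mean(axis=1)

print(np.mean(xbar**2))        # ≈ theta^2 + 1/n = 4.1
print(np.mean(xbar**2 - 1/n))  # ≈ theta^2 = 4.0
```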


The family $\mathcal{P}=\{P\sim\mathcal{N}(\theta,1):\theta\in \mathbb{R}\}$ is an exponential family: indeed, for a random sample of size $n$, $$\begin{aligned}f_\theta(x_1,\ldots,x_n)&=\prod_{1\leq k \leq n}\frac{1}{\sqrt{2\pi}}e^{-\frac{(x_k-\theta)^2}{2}}\\ &=\frac{1}{(2\pi)^{n/2}}\exp\bigg(-\frac{1}{2}\bigg(\sum_{1\leq k \leq n}x_k^2+n\theta^2-2\theta\sum_{1\leq k \leq n}x_k\bigg)\bigg)\\ &=\underbrace{\frac{1}{(2\pi)^{n/2}}\exp\bigg(-\frac{1}{2}\sum_{1\leq k \leq n}x_k^2\bigg)}_{=h(x)}\exp\bigg(\underbrace{-\frac{1}{2}n\theta^2}_{-\xi(\theta)}\bigg)\exp\bigg(\underbrace{\theta \sum_{1\leq k \leq n}x_k}_{\eta(\theta) T(x)}\bigg)\end{aligned}$$ where, as desired, $h$ and $T$ depend only on $x\in \mathbb{R}^n$, while $\xi$ and $\eta$ depend only on $\theta$.

We are in a very tractable case:

1. $T(X)=\sum_{1\leq k \leq n}X_k$ is a complete sufficient statistic (by the standard result for full-rank exponential families);
2. we know the distribution of this statistic: $T(X)\sim \mathcal{N}(n\theta,n)$;
3. hence $$E[T(X)^2]=V[T(X)]+E[T(X)]^2=n+n^2\theta^2.$$

We can conclude: define $u(X)=(T(X)/n)^2-1/n=\overline{X}^2-1/n$; then $$E[u(X)]=\frac{n+n^2\theta^2}{n^2}-\frac{1}{n}=\theta^2,$$ so $u(X)$ is an unbiased estimator of $\theta^2$ that is a function of the complete sufficient statistic $T(X)$. By the Lehmann–Scheffé theorem, $u(X)=\overline{X}^2-1/n$ is therefore the UMVUE of $\theta^2$.
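The two identities that carry the argument, $E[T(X)^2]=n+n^2\theta^2$ and $E[u(X)]=\theta^2$, can be checked numerically. A minimal sketch, with assumed illustrative values $\theta = 1.5$ and $n = 5$:

```python
import numpy as np

# Numerical check of the identities used above (theta and n are
# arbitrary illustrative choices, not part of the derivation):
#   T(X) = sum(X_k) ~ N(n*theta, n)   =>  E[T^2] = n + n^2 * theta^2
#   u(X) = (T/n)^2 - 1/n              =>  E[u]   = theta^2
rng = np.random.default_rng(1)
theta, n, reps = 1.5, 5, 500_000

X = rng.normal(theta, 1.0, size=(reps, n))
T = X.sum(axis=1)

print(np.mean(T**2))            # ≈ n + n^2*theta^2 = 61.25
print(np.mean((T/n)**2 - 1/n))  # ≈ theta^2 = 2.25
```

The second printed value hovering around $\theta^2$ for any choice of $\theta$ is exactly the unbiasedness that, combined with completeness of $T$, makes $u(X)$ the UMVUE.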