Showing that an estimator is minimax


I have the following question. Let $X\sim \text{Bin}(n,p)$ and consider estimating $p\in(0,1)$ with loss function, $$ L(p,\hat{p})=\left(1-\frac{\hat{p}}{p}\right)^2. $$ I need to show that the estimator $\hat{p}=0$ is minimax.

I have an idea of how to do this, since I have a theorem that says that if an estimator is admissible and has constant risk, then it is minimax. Constant risk is easy to check in this case: \begin{align} \mathbb{E}\left[L(p,0)\right] &= \mathbb{E}\left[\left(1-\frac{0}{p}\right)^2\right]\\ &= \mathbb{E}\left[1\right]\\ &= 1. \end{align}
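As a sanity check (not part of the proof), the constant risk can be verified numerically. It is also instructive to compare against a familiar estimator such as the MLE $X/n$, whose risk under this loss blows up as $p\to 0$; the choice of $n$ and the grid of $p$ values below are arbitrary:

```python
# Sanity check: under L(p, p_hat) = (1 - p_hat/p)^2, the estimator p_hat = 0
# has risk identically 1, while e.g. the MLE X/n has risk that diverges as p -> 0.
from math import comb

def risk(n, p, estimator):
    """Exact risk E[(1 - estimator(X)/p)^2] under X ~ Bin(n, p)."""
    return sum(
        comb(n, x) * p**x * (1 - p)**(n - x) * (1 - estimator(x) / p)**2
        for x in range(n + 1)
    )

n = 10
for p in (0.5, 0.1, 0.01, 0.001):
    r0 = risk(n, p, lambda x: 0.0)      # the candidate minimax estimator
    rmle = risk(n, p, lambda x: x / n)  # the MLE, for contrast
    print(f"p={p}: risk(0) = {r0:.6f}, risk(X/n) = {rmle:.3f}")
```

A short computation shows the MLE's risk here is $(1-p)/(np)$, which is unbounded as $p\to 0$, so no multiple of $X/n$ can have risk uniformly below 1.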

I am having a harder time showing that $\hat{p}=0$ is admissible. My idea is to assume that it is not admissible; then there exists an estimator $\tilde{p}$ with the property $$ \mathbb{E}\left[L(p,\tilde{p})\right] \leq 1 $$ for all $p$, with strict inequality for at least one $p\in(0,1)$. The left-hand side can be expanded: \begin{align} \mathbb{E}\left[L(p,\tilde{p})\right] &= \mathbb{E}\left[\frac{\tilde p^2}{p^2} - 2\frac{\tilde p}{p} + 1\right] \\ &= \frac{\mathbb{E}\left[\tilde p^2\right]}{p^2} - 2\frac{\mathbb{E}\left[\tilde p\right]}{p} + 1. \end{align} Combining this with the dominance condition above, we have $$ \frac{\mathbb{E}\left[\tilde p^2\right]}{p^2} \leq 2\frac{\mathbb{E}\left[\tilde p\right]}{p}, $$ which can be written as $$ \mathbb{E}\left[\tilde p^2\right] \leq 2p\,\mathbb{E}\left[\tilde p\right]. $$
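The expansion above is easy to check numerically (with an arbitrary stand-in for $\tilde p$, here $X/n$; the values of $n$ and $p$ are arbitrary too): the direct risk and the expanded form should agree up to floating-point error.

```python
# Check that E[L(p, pt)] = E[pt^2]/p^2 - 2 E[pt]/p + 1 for a sample estimator.
from math import comb

n, p = 10, 0.3
est = lambda x: x / n  # arbitrary choice of p-tilde for the check
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
risk = sum(w * (1 - est(x) / p)**2 for x, w in enumerate(pmf))
m1 = sum(w * est(x) for x, w in enumerate(pmf))
m2 = sum(w * est(x)**2 for x, w in enumerate(pmf))
expanded = m2 / p**2 - 2 * m1 / p + 1
print(risk, expanded)  # the two agree up to floating-point error
```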

Unfortunately this is as far as I can get without some further guidance. Can someone point me in the right direction? I haven't used the fact that $X$ is Binomial...

Best answer:

$\newcommand\E{\mathbb{E}}$From your last expression, Jensen's inequality tells us that any estimator with risk uniformly at most one must satisfy $$\E[\tilde p]^2 \leq \E[\tilde p^2] \leq 2p\E[\tilde p],$$ so $x=\E[\tilde p]$ satisfies $x(x-2p) \leq 0$, or equivalently $\E[\tilde p]\in[0,2p]$, for every $p$. Since $p$ can be arbitrarily close to zero, this forces $\E[\tilde p]\to 0$ as $p\to 0$: in the limit the estimator has no room to be positive on average.

To make this precise (and this is where the binomial structure of $X$ finally enters), suppose $\tilde p$ is not identically zero, and let $k$ be the smallest value in $\{0,\dots,n\}$ with $\tilde p(k)\neq 0$. Writing out $\E[\tilde p^2] \leq 2p\,\E[\tilde p]$ term by term, $$\sum_{x=k}^n \tilde p(x)^2\binom{n}{x}p^x(1-p)^{n-x} \;\leq\; 2p\sum_{x=k}^n \tilde p(x)\binom{n}{x}p^x(1-p)^{n-x}.$$ Divide both sides by $p^k$ and let $p\to 0$: the left-hand side tends to $\tilde p(k)^2\binom{n}{k}>0$, while the right-hand side tends to $0$, a contradiction. Hence $\tilde p=0$ almost surely, so its risk is identically $1$ and the strict inequality can never hold. Therefore $\hat p=0$ is admissible, and by your constant-risk theorem it is minimax. $\blacksquare$
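To see the failure mode numerically, take a hypothetical competitor $\tilde p$ that vanishes below some level $k$ and is constant from $k$ on, and compare the two sides of $\mathbb{E}[\tilde p^2] \leq 2p\,\mathbb{E}[\tilde p]$ after dividing by $p^k$: the inequality must break down as $p\to 0$. A minimal sketch (the estimator, $n$, $k$, and $c$ are all arbitrary choices for illustration):

```python
# Numerical illustration (not a proof): for a competitor that is zero below k
# and equal to c from k on, E[pt^2]/p^k stays bounded away from zero while
# 2p*E[pt]/p^k vanishes, so E[pt^2] <= 2p*E[pt] fails for small p.
from math import comb

def moments(n, p, est):
    """First two moments of est(X) under X ~ Bin(n, p)."""
    pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
    m1 = sum(w * est(x) for x, w in enumerate(pmf))
    m2 = sum(w * est(x)**2 for x, w in enumerate(pmf))
    return m1, m2

n, k, c = 10, 2, 0.5                  # arbitrary choices for the illustration
est = lambda x: c if x >= k else 0.0  # hypothetical competitor
for p in (0.1, 0.01, 0.001):
    m1, m2 = moments(n, p, est)
    lhs, rhs = m2 / p**k, 2 * p * m1 / p**k
    print(f"p={p}: E[pt^2]/p^k = {lhs:.4f}, 2p E[pt]/p^k = {rhs:.4f}")
# lhs tends to c^2 * C(n, k) = 11.25 here; rhs tends to 0.
```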