Estimating $n^2$ after observing $X\sim Bin(n,p)$ and knowing $p$


Let $n\in\mathbb N$ be unknown and let $p\in (0,1]$ be known.

Suppose that we observe $X\sim Bin(n,p)$. This allows us to estimate $\widehat n= X/p$ and we can use the Chernoff inequality to bound the probability that it deviates significantly from $n$.
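For concreteness, here is a small numerical sketch of that first step (the parameter values $n=100$, $p=0.3$, $\delta=0.5$ are my own illustrative choices, not from the question). It compares the exact tail probability of $\widehat n = X/p$ with the standard multiplicative Chernoff bound $\mathbb P(|X-np|\ge \delta np)\le 2e^{-\delta^2 np/3}$ for $0<\delta\le 1$:

```python
from math import comb, exp

def binom_pmf(k, n, p):
    """Exact Binomial(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Illustrative parameters (arbitrary choices).
n, p, delta = 100, 0.3, 0.5

# Exact tail: P(|n_hat - n| >= delta * n) with n_hat = X / p,
# which is the same event as |X - np| >= delta * n * p.
exact = sum(binom_pmf(k, n, p)
            for k in range(n + 1)
            if abs(k - n * p) >= delta * n * p)

# Standard multiplicative Chernoff bound for 0 < delta <= 1.
chernoff = 2 * exp(-delta**2 * n * p / 3)

print(f"exact tail     = {exact:.4e}")
print(f"chernoff bound = {chernoff:.4e}")
assert exact <= chernoff
```

The Chernoff bound is loose here (as usual for fixed constants), but it decays exponentially in $n$, which is the point.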

I'm interested in estimating the quantity $n^2$.

It's clear that $(X/p)^2$ is not the right estimator: since $\mathbb E[X^2] = \operatorname{Var}(X) + (\mathbb E X)^2 = np(1-p) + n^2p^2$, we have

$$\mathbb E[(X/p)^2] = n/p + n(n-1) = n^2 + n(1/p-1).$$

To remove the bias, we can use $$ \widehat {n^2} = (X/p)^2 - (X/p)(1/p-1), $$ which is unbiased because $\mathbb E[X/p]=n$, so $\mathbb E[\widehat{n^2}] = n^2 + n(1/p-1) - n(1/p-1) = n^2$.
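As a sanity check, unbiasedness can be verified exactly by summing the estimator against the binomial pmf (the values $n=40$, $p=0.25$ below are arbitrary):

```python
from math import comb

def binom_pmf(k, n, p):
    """Exact Binomial(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def estimator(x, p):
    """Proposed unbiased estimator of n^2: (x/p)^2 - (x/p) * (1/p - 1)."""
    return (x / p) ** 2 - (x / p) * (1 / p - 1)

# Arbitrary illustrative parameters.
n, p = 40, 0.25

# E[estimator(X, p)] computed exactly over the binomial support 0..n.
expectation = sum(binom_pmf(k, n, p) * estimator(k, p) for k in range(n + 1))

print(expectation, n**2)              # agree up to float rounding
assert abs(expectation - n**2) < 1e-6
```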

Question 1:

Is this the best (i.e. minimum-variance) unbiased estimator of $n^2$?

Question 2:

How can we get tail bounds for the estimator?


One easy tail bound follows from Chebyshev's inequality, since the variance of the estimator seems straightforward to bound. Can we get sharper (Chernoff-style) bounds?
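To illustrate the Chebyshev route numerically, the snippet below computes the exact variance of $\widehat{n^2}$ by summing against the binomial pmf, then compares the exact tail probability at a three-standard-deviation threshold $t$ with the Chebyshev bound $\operatorname{Var}(\widehat{n^2})/t^2 = 1/9$ (parameters are again arbitrary):

```python
from math import comb

def binom_pmf(k, n, p):
    """Exact Binomial(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def estimator(x, p):
    # Unbiased estimator of n^2 from the question.
    return (x / p) ** 2 - (x / p) * (1 / p - 1)

n, p = 40, 0.5   # arbitrary illustrative parameters
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
vals = [estimator(k, p) for k in range(n + 1)]

mean = sum(w * v for w, v in zip(pmf, vals))            # equals n^2 exactly
var = sum(w * (v - mean) ** 2 for w, v in zip(pmf, vals))

t = 3 * var ** 0.5   # deviation threshold: three standard deviations
exact_tail = sum(w for w, v in zip(pmf, vals) if abs(v - n**2) >= t)
chebyshev = var / t**2                                   # = 1/9

print(f"Var = {var:.1f}, exact tail = {exact_tail:.4f}, "
      f"Chebyshev bound = {chebyshev:.4f}")
assert exact_tail <= chebyshev
```

The exact tail is far below $1/9$, which is the usual gap that motivates asking for Chernoff-style bounds instead.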

Since the only source of randomness here is $X\sim Bin(n,p)$, I suppose this is equivalent to looking for a sharp tail bound for $X^2$.