Estimation with prior knowledge


We have i.i.d. samples $\{X_1,\dots, X_n\}$ from $Ber(p)$ where $p$ is a fixed constant. We are interested in estimating $p$.

Given the prior knowledge (before observing the data) that $p\in [\hat{p}-\epsilon, \hat{p}+\epsilon]$ with probability at least $1-\delta$ for some known $\hat{p},\delta,\epsilon>0$, what is the best way to estimate $p$?

I am interested in a method that does not assume anything additional about $p$. For example, if a prior on $p$ were known, a Bayesian estimator could be used; however, no assumption on a prior is possible in the application I am interested in.


Your question is not stated clearly. Is your prior knowledge $$\Pr[|p - \hat p| \le \epsilon] = 1 - \delta$$ prior in the sense of being known before you observe the sample? Are $\epsilon$ and $\delta$ fixed and known values? Is $\hat p$ a statistic or is it a fixed, known quantity? Do you regard $p$ as a random variable with a prior distribution?

As stated, the question does not pin down what you are trying to accomplish.

For example, you could place a prior on $p$ that is uniform on $[\hat p - \epsilon, \hat p + \epsilon]$ with total probability $1-\delta$ and uniform over the rest of $[0,1]$ with total probability $\delta$: $$f(p) = \begin{cases} \frac{1-\delta}{2\epsilon}, & |p - \hat p| \le \epsilon, \\ \frac{\delta}{1-2\epsilon}, & |p - \hat p| > \epsilon \text{ and } p \in [0,1], \end{cases}$$ assuming $[\hat p - \epsilon, \hat p + \epsilon] \subseteq [0,1]$ so that the density integrates to $1$. Taking $\hat p = \bar x$ to be the sample mean, the posterior distribution of $p$ (and the posterior predictive distribution of a new sample) can then be computed. But I don't know if this is what you intend.
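To make the suggestion concrete, here is a minimal sketch of computing the posterior of $p$ under that piecewise-uniform prior by grid approximation. All numerical values ($\hat p$, $\epsilon$, $\delta$, the true $p$, the sample size) are assumed purely for illustration and are not from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

p_true, n = 0.3, 50
x = rng.binomial(1, p_true, size=n)   # i.i.d. Bernoulli(p) sample
k = x.sum()                           # number of successes

# Assumed prior-knowledge parameters; interval must lie inside [0, 1]
# for the density below to integrate to 1.
p_hat, eps, delta = 0.35, 0.10, 0.05

grid = np.linspace(1e-6, 1 - 1e-6, 10001)

# Piecewise-uniform prior: mass 1 - delta on [p_hat - eps, p_hat + eps],
# mass delta spread uniformly over the rest of [0, 1].
inside = np.abs(grid - p_hat) <= eps
prior = np.where(inside, (1 - delta) / (2 * eps), delta / (1 - 2 * eps))

# Bernoulli likelihood combined with the prior, on the log scale for
# numerical stability, then normalized on the grid.
log_post = k * np.log(grid) + (n - k) * np.log(1 - grid) + np.log(prior)
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, grid)

post_mean = np.trapz(grid * post, grid)  # posterior-mean estimate of p
print(f"sample mean = {k / n:.3f}, posterior mean = {post_mean:.3f}")
```

A grid approximation is used because this prior is not conjugate to the Bernoulli likelihood, so the posterior has no simple closed form; a finer grid or quadrature rule trades compute for accuracy.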