Finding an unbiased estimator for a parameter, discrete variable


Let $X : \Omega \to \mathbb{N}$ be a random variable. Define $p_i = P(X=i), \ \ i \in \mathbb{N}$.

Find an unbiased and consistent estimator for $p_1$.

I need to find an estimator $\alpha_n(X_1 + ... + X_n)$ for which I would have $\mathbb{E} \alpha_n = p_1$ and $\alpha_n \to p_1$ in probability or almost surely.

I've found this question, but it isn't helpful.

Could you tell me how to look for estimators of a given parameter?


There are 2 answers below.

Accepted answer

Here is an answer with the simplest notation I can manage. Maybe you can match the ideas here with the content of the other answer by @r.e.s.

You want to estimate the probability $p_1$ that $X = 1$ based on a sample of $n$ independent observations from the distribution of $X$. Let $Y_n$ be the number of instances among the $n$ in which $X = 1$. Then $Y_n \sim \mathrm{Bin}(n, p_1)$, and an unbiased, consistent estimator of $p_1$ is $\alpha_n = Y_n/n.$ From what you may know about binomial random variables, I suppose you can prove the two necessary relationships for 'unbiased' and 'consistent.'
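For completeness, here is a sketch of the two properties, using only the mean and variance of a binomial random variable: since $\mathbb{E}[Y_n] = np_1$ and $\operatorname{Var}(Y_n) = np_1(1-p_1)$,
$$\mathbb{E}[\alpha_n] = \frac{\mathbb{E}[Y_n]}{n} = \frac{np_1}{n} = p_1,$$
so $\alpha_n$ is unbiased. For consistency, $\operatorname{Var}(\alpha_n) = \frac{p_1(1-p_1)}{n} \to 0$, so by Chebyshev's inequality, for any $\varepsilon > 0$,
$$P\bigl(|\alpha_n - p_1| \ge \varepsilon\bigr) \le \frac{p_1(1-p_1)}{n\varepsilon^2} \to 0,$$
i.e. $\alpha_n \to p_1$ in probability.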

If you want a concrete situation to think about, imagine being given a die that may or may not be fair. By rolling it $n$ times you want to estimate the probability that the die comes up "1".

Second answer

HINT: Proportions are intuitive estimators of probabilities; i.e., to estimate $P(X \in A)$ given i.i.d. observations $X_1,\dots,X_n$ of $X$, consider the proportion of the $n$ observations that are in $A$: $$\hat{P}(X\in A) = \frac{1_{X_1\in A} + 1_{X_2\in A} + \cdots + 1_{X_n\in A}}{n}, $$
where $$1_E = \begin{cases} 1, & \text{if } E \text{ occurs,} \\ 0, &\text{otherwise.} \end{cases} $$

NB: Although you wrote that the estimator should be a function of $X_1 + ... + X_n$, I assume this should be $X_1, ..., X_n$.
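As a quick sanity check, the proportion estimator above is easy to simulate. A minimal sketch, assuming a made-up distribution on $\{1, 2, 3\}$ with $p_1 = 0.3$ (any distribution on $\mathbb{N}$ would do):

```python
import random

def estimate_p1(sample):
    # Proportion of observations equal to 1, i.e. (1_{X_1=1} + ... + 1_{X_n=1}) / n.
    return sum(1 for x in sample if x == 1) / len(sample)

# Hypothetical distribution with P(X=1) = 0.3, P(X=2) = 0.5, P(X=3) = 0.2.
random.seed(0)
sample = random.choices([1, 2, 3], weights=[0.3, 0.5, 0.2], k=100_000)

print(estimate_p1(sample))  # should be close to the true p_1 = 0.3
```

With $n = 100{,}000$ draws the estimate lands very near $0.3$, illustrating consistency; rerunning with different seeds and averaging would illustrate unbiasedness.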