Calculating UMVU estimator of an i.i.d sample with Bernoulli distribution


First we're given $V_1 = 10$ i.i.d. r.v. $R_i^{(1)} \text{, } i=1, \ldots, 10$, with the distribution

$$ P(R_1^{(1)} = 1)=0.1$$ $$ P(R_1^{(1)} = 0)=0.9$$

Then we have $V_2 = 30$ i.i.d. r.v. $R_i^{(2)}\text{, } i=1, \ldots, 30$ with the same distribution.

We know the values

$$Z_j = \frac{S_j}{V_j}=\frac{\sum_{i=1}^{V_j}R_i^{(j)}}{V_j} \,, \quad \text{for } j=1,2$$

How can one determine the unbiased estimator $\hat{\mu}$ of $\mu = E[R_i^{(j)}]$ with the smallest variance (i.e. the UMVU estimator)?


The sample $(R_i^{(j)})_{1\leq i\leq V_j}$ consists of i.i.d. random variables with distribution $R_i^{(j)}\sim \text{Bernoulli}(\mu)$ and expected value $\mu = E(R_i^{(j)})$. The Bernoulli distribution belongs to the exponential family, and the unknown parameter we want to estimate is $\mu\in (0,1)$. Here,
$$ S_j = \sum_{i=1}^{V_j} R_i^{(j)} $$
is a complete sufficient statistic. The sample mean
$$ Z_j = \frac{S_j}{V_j} = \frac{1}{V_j} \sum_{i=1}^{V_j} R_i^{(j)} $$
is a function of $S_j$, and it is an unbiased estimator of $\mu$:
$$ E(Z_j) = \frac{1}{V_j} \sum_{i=1}^{V_j} E(R_i^{(j)}) = \mu \, . $$
By the Lehmann–Scheffé theorem, $\tilde{\mu} = Z_j$ is therefore the UMVU estimator of $\mu$ for the $j$th sample.
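As a quick sanity check, the unbiasedness of $Z_j$ can be verified by Monte Carlo simulation. This is only an illustrative sketch: the helper `sample_mean` and the simulation parameters (`v = 30`, `reps = 20_000`, seed) are assumptions of the example, not part of the question.

```python
import random

def sample_mean(v, mu, rng):
    """Draw an i.i.d. Bernoulli(mu) sample of size v and return Z = S / v."""
    s = sum(1 if rng.random() < mu else 0 for _ in range(v))
    return s / v

# Average Z over many repeated samples; by unbiasedness, E(Z) = mu,
# so this average should settle near mu = 0.1.
rng = random.Random(0)
mu, v, reps = 0.1, 30, 20_000
avg_z = sum(sample_mean(v, mu, rng) for _ in range(reps)) / reps
print(avg_z)  # close to 0.1
```

The average over repetitions estimates $E(Z_j)$; it converges to $\mu$ as the number of repetitions grows, which is exactly what unbiasedness asserts.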

The union of two independent i.i.d. samples of sizes $V_1$ and $V_2$ is an i.i.d. sample of size $V_1+V_2$. The UMVU estimator based on both samples is therefore the pooled mean $$ \hat{\mu} = \frac{S_1 + S_2}{V_1 + V_2} = \frac{V_1}{V_1 + V_2} Z_1 + \frac{V_2}{V_1 + V_2} Z_2 \, , $$ which is a weighted mean of $Z_1$ and $Z_2$.
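The identity between the pooled mean and the weighted mean of $Z_1$ and $Z_2$ can be checked numerically. A minimal sketch, assuming the sample sizes from the question ($V_1 = 10$, $V_2 = 30$) and simulated Bernoulli$(0.1)$ data; the seed is arbitrary:

```python
import random

# Simulate two independent Bernoulli(0.1) samples of sizes V1 = 10, V2 = 30.
rng = random.Random(1)
mu, v1, v2 = 0.1, 10, 30
s1 = sum(1 if rng.random() < mu else 0 for _ in range(v1))
s2 = sum(1 if rng.random() < mu else 0 for _ in range(v2))
z1, z2 = s1 / v1, s2 / v2

# Pooled estimator over the combined sample of size V1 + V2 ...
mu_hat = (s1 + s2) / (v1 + v2)

# ... equals the weighted mean of Z1 and Z2 with weights V_j / (V1 + V2).
weighted = v1 / (v1 + v2) * z1 + v2 / (v1 + v2) * z2
print(mu_hat, weighted)
```

The two expressions agree exactly for any realization of the data, since the second is just an algebraic rearrangement of the first.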


Notes

(a) If $\mu = 0.1$ is known before carrying out the estimation, then nothing needs to be estimated: the "estimator" $\hat{\mu} = \mu = 0.1$ is trivially unbiased and has zero variance.

(b) In the general case (outside the exponential family), the UMVU property requires an unbiased estimator that is a function of a complete sufficient statistic, which may be harder to obtain than a merely minimal sufficient one.