Mean Square Error & Bias


The random variable $Y$ is related to the angle at which muon particles decay. Y has density function:

$$f(y)=\frac{1+\alpha y}{2} \; \; -1 \leq y \leq 1$$

where $\alpha$ is a parameter satisfying $-1 \leq \alpha \leq 1$.

Let $Y_i$ be a random sample of size $n$ from this distribution. Consider the estimator $\hat{\alpha} = 3\bar{Y}$.

a. Determine the bias and mean square error of this estimator.

I've attempted this, and this is what I've come up with; maybe someone can give me a hint about what I'm doing wrong.

$B(\hat{\alpha}) = E(\hat{\alpha})-\alpha = E(3\bar{Y})-\alpha = 3E(\bar{Y})-\alpha$

$MSE(\hat{\alpha})=E\left[(\hat{\alpha}-\alpha)^2\right]=E\left[(3\bar{Y}-\alpha)^2\right]$

Not quite sure what I'm doing wrong here.

b. Take $n = 100$ and $\alpha = -0.75$. Approximate the probability that $\hat{\alpha}$ is within $0.10$ of $-0.75$.

Not quite sure how to estimate this either.

Best answer

Observe that $Y_1, \ldots, Y_n$ are all independent and identically distributed random variables following the distribution of $Y$. So the expectation of the sample mean is $$\operatorname{E}[\bar Y] = \operatorname{E}\left[\frac{1}{n}(Y_1 + \cdots + Y_n) \right] = \frac{1}{n} \sum_{i=1}^n \operatorname{E}[Y_i],$$ by the linearity of expectation. Now we need to compute the expectation of one such $Y$; e.g., through integration of the density: $$\operatorname{E}_\alpha[Y] = \int_{y=-1}^1 y f_Y(y) \, dy = \frac{\alpha}{3}.$$ Ah, so that's why our estimator of $\alpha$ is $3\bar Y$: we then see that $$\operatorname{E}[\hat\alpha] = \alpha,$$ which means this estimator is unbiased.
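As a sanity check, the integral $\operatorname{E}_\alpha[Y] = \alpha/3$ (and hence the unbiasedness of $3\bar Y$) can be verified numerically. A minimal Python sketch using composite Simpson's rule, with $\alpha = -0.75$ as an arbitrary test value in $[-1, 1]$:

```python
def f(y, alpha):
    """Density of Y: (1 + alpha*y)/2 on [-1, 1]."""
    return (1 + alpha * y) / 2

def simpson(g, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3

alpha = -0.75  # arbitrary test value
mean_Y = simpson(lambda y: y * f(y, alpha), -1, 1)
print(mean_Y, alpha / 3)  # both -0.25 (Simpson is exact here: the integrand is a quadratic)
```

Since $\operatorname{E}[Y] = \alpha/3$, multiplying the sample mean by $3$ exactly cancels the factor, which is the motivation behind the estimator.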

What is the variance of $\hat\alpha$? Were the sample observations correlated, we would need their covariance structure, but because we can assume they are independent, we have $$\operatorname{Var}[Y_1 + \cdots + Y_n] \overset{\text{ind}}{=} \operatorname{Var}[Y_1] + \cdots + \operatorname{Var}[Y_n],$$ i.e., when random variables are independent, the variance of their sum is equal to the sum of their variances. So we have $$\operatorname{Var}[\hat\alpha] = \operatorname{Var}[3\bar Y] = 3^2 \operatorname{Var}[\bar Y] = \frac{9}{n^2} (n \operatorname{Var}_\alpha[Y]).$$ Here I've taken a bit of a shortcut: since the $Y_i$s are identically distributed, just as in the expectation calculation above, the variance of their sum is simply $n$ times the variance of a single $Y$. To get this variance, we now compute the second moment of $Y$: $$\operatorname{E}_\alpha[Y^2] = \int_{y=-1}^1 y^2 f_Y(y) \, dy = \frac{1}{3}.$$ (Note this result is independent of $\alpha$.) Hence $$\operatorname{Var}_\alpha [Y] = \operatorname{E}_\alpha [Y^2] - \operatorname{E}_\alpha [Y]^2 = \frac{1}{3} - \left(\frac{\alpha}{3}\right)^2.$$

Therefore, $$\operatorname{Var}_\alpha[\hat\alpha] = \frac{3-\alpha^2}{n},$$ and because we have established that $\hat\alpha$ is unbiased, the mean square error (being the sum of the variance and the square of the bias) is simply equal to the variance.
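The variance (and hence MSE) formula can be checked numerically the same way, by computing the first two moments of $Y$ by quadrature and assembling $\operatorname{Var}[\hat\alpha] = (9/n)\operatorname{Var}[Y]$. A sketch, again with the arbitrary choices $\alpha = -0.75$ and $n = 100$:

```python
def f(y, alpha):
    """Density of Y: (1 + alpha*y)/2 on [-1, 1]."""
    return (1 + alpha * y) / 2

def simpson(g, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3

alpha, n = -0.75, 100
m1 = simpson(lambda y: y * f(y, alpha), -1, 1)       # E[Y] = alpha/3
m2 = simpson(lambda y: y * y * f(y, alpha), -1, 1)   # E[Y^2] = 1/3
var_Y = m2 - m1 ** 2                                 # 1/3 - (alpha/3)^2
var_hat = 9 * var_Y / n
print(var_hat, (3 - alpha ** 2) / n)  # both 0.024375
```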

As for your part (b), we can use the Central Limit Theorem to calculate the relevant probability, using a normal approximation: that is to say, $$\Pr[|\hat\alpha - \alpha| \le 0.1] \approx \Pr\left[\left|\frac{\hat\alpha-\alpha}{\sqrt{\operatorname{Var}[\hat\alpha]}}\right| \le \frac{1}{10}\sqrt{\frac{n}{3-\alpha^2}} \right] = \Pr\left[|Z| \le \tfrac{4}{\sqrt{39}}\right],$$ where $Z \sim \operatorname{Normal}(0,1)$ is a standard normal random variable, and we have substituted $n = 100$ and $\alpha = -0.75$ into the variance. The remainder is a straightforward reference to a normal distribution table, or a calculator.
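In place of a table lookup, the final probability can be evaluated with the identity $\Pr[|Z| \le z] = 2\Phi(z) - 1 = \operatorname{erf}(z/\sqrt{2})$ for standard normal $Z$, using only the standard library:

```python
import math

alpha, n = -0.75, 100
sd = math.sqrt((3 - alpha ** 2) / n)  # std. dev. of alpha-hat, about 0.156
z = 0.1 / sd                          # = 4/sqrt(39), about 0.6405
# Pr[|Z| <= z] = 2*Phi(z) - 1 = erf(z/sqrt(2)) for Z ~ Normal(0, 1)
prob = math.erf(z / math.sqrt(2))
print(round(prob, 3))  # approximately 0.478
```

So the approximate probability that $\hat\alpha$ falls within $0.10$ of $-0.75$ is about $0.478$.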