In my textbook, *Introduction to Probability and Statistical Applications*, $s^2$ is an unbiased estimator of the variance. However, when this estimator is multiplied by some constant $a$, it becomes biased, with a bias of $(1-a)$ times the variance. Could someone please explain why?
Thank you very much
Unbiased means, by definition, that the expectation of the estimator equals the quantity it estimates: $$\mathbb{E}[s^2] = \sigma^2$$ When you multiply by some constant $a$, the expectation becomes, by linearity of expectation, $$\mathbb{E}[as^2] = a\sigma^2$$ so the bias (which is, again by definition, the difference between the expectation of your estimator and the quantity you seek to estimate) is $$\textrm{bias} =\mathbb{E}[as^2]-\sigma^2 = (a-1)\sigma^2$$ Note the sign: this equals $-(1-a)\sigma^2$, so if your textbook reports the bias as $(1-a)\sigma^2$, it is most likely defining bias with the opposite sign convention, $\sigma^2 - \mathbb{E}[as^2]$.
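You can also see this numerically. The sketch below (an illustrative Monte Carlo check, not from the textbook; the choices of $n=5$, $a=0.5$, $\sigma^2=4$, and 200,000 trials are arbitrary) averages $s^2$ and $as^2$ over many samples and compares them with $\sigma^2$:

```python
import random
import statistics

# Illustrative Monte Carlo sketch: draw many samples from N(0, 2^2), so the
# true variance is sigma^2 = 4. The sample size n, shrink factor a, and
# trial count are arbitrary choices for demonstration.
random.seed(0)
sigma2 = 4.0
n, a, trials = 5, 0.5, 200_000

mean_s2 = 0.0   # running average of s^2
mean_as2 = 0.0  # running average of a * s^2
for _ in range(trials):
    x = [random.gauss(0, 2) for _ in range(n)]
    s2 = statistics.variance(x)  # unbiased sample variance (divides by n-1)
    mean_s2 += s2 / trials
    mean_as2 += a * s2 / trials

print(mean_s2)            # close to sigma^2 = 4, since s^2 is unbiased
print(mean_as2 - sigma2)  # close to (a-1)*sigma^2 = -2, the bias of a*s^2
```

The first average lands near $\sigma^2$, while the second differs from $\sigma^2$ by roughly $(a-1)\sigma^2$, matching the derivation above.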