It's been a while since I've done statistics, but I'm working on a problem with the following pdf: $$f(x;\lambda) =\cfrac{\lambda}{x^{1+\lambda}}, \;\;\; 1 \leq x < \infty, \;\;\; \lambda > 0$$
I want to determine whether the sample mean estimator $\widehat{\lambda} = E(X)$ is a biased estimator of $\lambda$.
$$E(X) = \int_{1}^{\infty} \cfrac{x\lambda}{x^{1+\lambda}}dx = \int_{1}^{\infty} \cfrac{\lambda}{x^{\lambda}}dx $$
which does not converge for $\lambda \leq 1$. Since $\lambda>0$, is it fair to say from this result alone that the estimator is biased, or have I made some glaringly obvious mistake here?
For $\lambda > 1$, it is easy to see that $E(X) = \cfrac{\lambda}{\lambda-1}$, which differs from $\lambda$ in general, but I was a little unsure because my integral doesn't converge over part of the range my parameter is defined on, and I've never encountered that before.
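Writing out the antiderivative for $\lambda > 1$ makes the convergent case explicit (the upper limit vanishes because $x^{1-\lambda} \to 0$ as $x \to \infty$):

$$\int_{1}^{\infty} \lambda x^{-\lambda}\,dx = \left[\cfrac{\lambda}{1-\lambda}\, x^{1-\lambda}\right]_{1}^{\infty} = 0 - \cfrac{\lambda}{1-\lambda} = \cfrac{\lambda}{\lambda-1}.$$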
Thanks!
The sample mean is not $E(X)$ or that integral. It is $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$.
Whenever $E(X)$ exists, linearity of expectation gives $\mathsf{E}[\bar{X}] = E(X)$.
The expected value of your estimator (the sample mean) is therefore $\cfrac{\lambda}{\lambda-1}$ when $\lambda > 1$, and it is $\infty$ when $\lambda \leq 1$.
The definition of bias is: $\text{Bias}(\hat\lambda, \lambda) = \mathsf{E}[ \hat\lambda] - \lambda $
Consider the two cases $\lambda \leq 1$ and $\lambda > 1$ separately.
In the first case, $\mathsf{E}[\hat\lambda] = \infty$, so $\text{Bias}(\hat\lambda, \lambda) \neq 0$.
In the second case, $\cfrac{\lambda}{\lambda-1} = \lambda$ only at the single point $\lambda = 2$, so $\text{Bias}(\hat\lambda, \lambda) \neq 0$ for every $\lambda \neq 2$.
Since the bias is nonzero for almost every value of the parameter, the sample mean is a biased estimator of $\lambda$.
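As a quick sanity check for the $\lambda > 1$ case, here is a minimal Monte Carlo sketch (assuming NumPy; the choice $\lambda = 3$ is just for illustration). Note that NumPy's `pareto` samples the Lomax distribution, so we add 1 to get the classical Pareto with $x_m = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 3.0            # true parameter; lambda > 1 so E[X] is finite
n, reps = 1_000, 2_000

# Draw reps independent samples of size n from the Pareto(lambda) pdf
# f(x) = lambda / x^(1+lambda) on [1, inf): shift numpy's Lomax by +1.
samples = rng.pareto(lam, size=(reps, n)) + 1

# Average of the sample means across replications approximates E[lambda-hat]
est = samples.mean(axis=1).mean()

print(f"mean of lambda-hat: {est:.3f}")              # close to lambda/(lambda-1) = 1.5
print(f"lambda/(lambda-1):  {lam / (lam - 1):.3f}")
print(f"true lambda:        {lam}")                  # 3.0, so the estimator is biased
```

The simulated mean of $\hat\lambda$ lands near $\lambda/(\lambda-1) = 1.5$ rather than $\lambda = 3$, matching the bias computation above.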