Unbiased estimator for a sample.


Let $X_1, X_2, \ldots, X_n$ be a random sample from a $U(0, \theta)$ distribution. Prove that $\frac{n+1}{n}X_{(n)}$ is an unbiased estimator of $\theta$.

So what I did is:

$\mathbb{E}\big(\frac{n+1}{n}X_{(n)}\big)=\frac{n+1}{n}\mathbb{E}(X_{(n)})$

And the part that I don't understand is why $X_{(n)}=\max\{X_1,\ldots, X_n\}$. From there it's easy to find the probability density function and its expected value.

I'm following this question/answers for that:

https://math.stackexchange.com/q/744517 and this pdf: https://mcs.utm.utoronto.ca/~nosedal/sta260/sta260-chap9.pdf


There are 2 best solutions below

On BEST ANSWER

It's standard terminology. The maximum of $n$ random variables $X_1, X_2, \ldots, X_n$ is denoted by $X_{(n)}$. Likewise, the minimum is denoted by $X_{(1)}$. In general, the $k$th largest random variable is denoted by $X_{(n-k+1)}$.
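With that notation in hand, the expectation the question needs follows from the CDF of the maximum of i.i.d. $U(0,\theta)$ variables; a sketch of the computation:

$$F_{X_{(n)}}(x) = P(X_{(n)} \le x) = P(X_1 \le x, \ldots, X_n \le x) = \left(\frac{x}{\theta}\right)^n, \quad 0 \le x \le \theta,$$

so $f_{X_{(n)}}(x) = \dfrac{n x^{n-1}}{\theta^n}$ on $[0,\theta]$, and

$$\mathbb{E}(X_{(n)}) = \int_0^\theta x \cdot \frac{n x^{n-1}}{\theta^n}\,dx = \frac{n}{n+1}\,\theta, \qquad \mathbb{E}\Big(\frac{n+1}{n}X_{(n)}\Big) = \theta.$$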


$X_{(i)}$ denotes the $i$-th order statistic, the $i$-th smallest number among $X_1, \ldots, X_n$.

$X_{(n)}$ means the $n$-th smallest number, which is the largest.
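The unbiasedness claim can also be checked empirically; a minimal Monte Carlo sketch in Python with NumPy (the parameter values and variable names here are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 5.0, 10, 200_000  # true parameter, sample size, simulations

# Draw `reps` samples of size n from U(0, theta); take each sample's maximum.
samples = rng.uniform(0.0, theta, size=(reps, n))
x_max = samples.max(axis=1)

# The raw maximum is biased low: E[X_(n)] = n/(n+1) * theta.
naive_mean = x_max.mean()

# Scaling by (n+1)/n corrects the bias: E[(n+1)/n * X_(n)] = theta.
corrected_mean = ((n + 1) / n * x_max).mean()

print(naive_mean)      # close to n/(n+1) * theta ≈ 4.545
print(corrected_mean)  # close to theta = 5.0
```

With 200,000 replications the simulated means should sit within a few thousandths of their theoretical values, making the $\frac{n+1}{n}$ correction easy to see.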