I am having trouble with the following example:
Suppose that we have samples $y$ drawn from a random variable uniformly distributed between $0$ and $x$. We would like to estimate $x$ from the data $y_1,\dots,y_N$. Let us first discuss the properties of the estimator $$\hat{X}\left(y\right)=\frac{2}{N}\left(y_1+y_2+\cdots+y_N\right).$$ Since this is just a linear combination of the data, it is easy to calculate the moments of $\hat{X}$ in terms of the moments of the data. For a uniform random variable $Y$ extending from zero to $x$, we know that
$$E\left[Y|x\right]=\frac{1}{2}x$$ $$E\left[Y^2|x\right]=\frac{1}{3}x^2\quad\Rightarrow\quad \text{var}\left[Y|x\right]=\frac{1}{12}x^2$$
Hence when $N$ of these independent random variables are added together, their means and variances simply add. Thus $$E\left[\hat{X}|x\right]=\frac{2}{N}\left(\frac{N}{2}x\right)=x$$ $$\text{var}\left[\hat{X}|x\right]=\frac{4}{N^2}\left(\frac{N}{12}x^2\right)=\frac{x^2}{3N}.$$ The estimator is unbiased, and its variance and mean-square error are both equal to $x^2/\left(3N\right)$.
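The two claims above (unbiasedness and variance $x^2/(3N)$) can be checked numerically. A minimal Monte Carlo sketch, where the choices of $x$, $N$, and the number of trials are arbitrary illustration values:

```python
import random

# Monte Carlo check that X_hat = (2/N) * sum(y_i) is unbiased
# with variance x^2 / (3N), for uniform samples on [0, x].
# x, N, and trials are arbitrary choices for illustration.
x = 5.0
N = 50
trials = 200_000

random.seed(0)
estimates = []
for _ in range(trials):
    ys = [random.uniform(0.0, x) for _ in range(N)]
    estimates.append(2.0 / N * sum(ys))

# Sample mean and variance of the estimator across trials.
mean_est = sum(estimates) / trials
var_est = sum((e - mean_est) ** 2 for e in estimates) / trials

print(mean_est)  # should be close to x = 5.0
print(var_est)   # should be close to x^2 / (3N) = 25/150
```

With these settings the sample mean lands very close to $x$ and the sample variance close to $x^2/(3N)$, matching the derivation.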
I do not see how to obtain these expected values. Can you help me with this? Any hint or help is welcome.
The density function of $Y$ is $\frac{1}{x}$ for $0\le y\le x$ and $0$ otherwise. Therefore the moments of $Y$ are $$\mu_k=\int_0^x\frac{y^k}{x}\,dy=\frac{x^k}{k+1}.$$ The mean is $\mu_1=\frac{x}{2}$ and the second moment is $\mu_2=\frac{x^2}{3}$, as you have found.
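The integral above can also be verified numerically. A small sketch using a midpoint-rule quadrature (the values of $x$ and the grid resolution are arbitrary illustration choices):

```python
# Numerically approximate mu_k = integral_0^x of y^k / x dy
# with the midpoint rule and compare with the closed form x^k / (k+1).
# x and the number of grid steps are arbitrary illustration choices.
x = 2.0
steps = 100_000
dy = x / steps

mu = {}
for k in (1, 2):
    # Sum the integrand y^k / x at midpoints of each subinterval.
    mu[k] = sum(((i + 0.5) * dy) ** k / x * dy for i in range(steps))
    print(k, mu[k], x ** k / (k + 1))
```

The first moment comes out as $x/2$ and the second as $x^2/3$, agreeing with the closed-form expression.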