a) Let $X_{1},\ldots,X_{n}$ be i.i.d. Uniform$[0,\theta]$. Show that the estimator $\beta(X)=\max(X_{1},\ldots,X_{n})$ is a biased estimator for $\theta$. Find an unbiased estimator based on $\beta(X)$. My attempt:
$E[\beta]=\int_{0}^{\theta}\frac{1}{\theta}\max(X_{1},\ldots,X_{n})\,d\theta$, but I don't think that's right.
b) For $0<p<1$ define a random variable $X$ by $P(X=0)=p$ and $P(X=1)=1-p$. Find an unbiased estimator for $2p$. Show that there is no unbiased estimator for $p^2$. My attempt: I don't know how to 'conjure up' unbiased estimators. For the second part, assume $\gamma$ is an unbiased estimator for $p^2$ and try to derive a contradiction. $E(\gamma)=\gamma(0)p+\gamma(1)(1-p)=p^2$. Is this a contradiction?
Thanks
You seem to be mixing together several incompatible notations and incompatible objects...
For example, the expectation of the estimator is not what you write (which is absurd) but $$ E_\theta(\beta(X))=\int_{[0,\theta]^n}\max\{x_1,x_2,\ldots,x_n\}\,\theta^{-n}\mathrm dx_1\mathrm dx_2\ldots\mathrm dx_n. $$ Alternatively, if one knows that the distribution of $\beta(X)$ has density $g_\theta$ on $[0,\theta]$, then $$ E_\theta(\beta(X))=\int_0^\theta ug_\theta(u)\mathrm du. $$ This second approach is somewhat simpler if one knows $g_\theta$. But can you identify $g_\theta$?
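If it helps to see the bias concretely before computing $g_\theta$, here is a quick Monte Carlo sketch (the sample size, seed, and helper name are my own choices, not part of the exercise): for Uniform$[0,\theta]$ samples, the simulated mean of $\max(X_1,\ldots,X_n)$ falls clearly below $\theta$.

```python
import random

def mean_of_max(theta, n, trials=100_000, seed=0):
    """Monte Carlo estimate of E[max(X_1,...,X_n)] for X_i ~ Uniform[0, theta]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.uniform(0.0, theta) for _ in range(n))
    return total / trials

theta, n = 1.0, 5
est = mean_of_max(theta, n)
print(est)  # noticeably below theta = 1.0, so beta(X) is biased low
```

Computing $E_\theta(\beta(X))$ exactly, via $g_\theta$, tells you by exactly which factor the maximum falls short, and hence how to rescale it into an unbiased estimator.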
Re b), one should assume that $\gamma(X)$ (not $\gamma$) is an unbiased estimator of $p^2$, that is, that the function $\gamma:\{0,1\}\to\mathbb R$ is independent of $p$ and that, for every $p$, $$ E_p(\gamma(X))=p^2. $$ As you noted, this implies that $$ p\gamma(0)+(1-p)\gamma(1)=p^2, $$ that is, for every $p$ in $(0,1)$, $$ p^2+(\gamma(1)-\gamma(0))p-\gamma(1)=0. $$ Can you show that no function $\gamma$ achieves this?
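The key observation is that a quadratic in $p$ that vanishes for every $p$ in $(0,1)$ must have all coefficients equal to zero, yet the leading coefficient here is $1$. A small sketch (the names `g0`, `g1` for $\gamma(0),\gamma(1)$ and the sample points are my own) forces the identity at two values of $p$ by solving the resulting $2\times 2$ linear system, and then shows it already fails at a third value:

```python
from fractions import Fraction

def residual(p, g0, g1):
    """Value of p^2 + (gamma(1) - gamma(0)) p - gamma(1); zero iff gamma is unbiased at p."""
    return p**2 + (g1 - g0) * p - g1

# Rearranged, the identity reads  -p*g0 + (p - 1)*g1 = -p^2.
# Solve that linear system exactly at p1 and p2 by Cramer's rule.
p1, p2, p3 = Fraction(1, 2), Fraction(1, 4), Fraction(3, 4)
det = (-p1) * (p2 - 1) - (p1 - 1) * (-p2)
g0 = ((-p1**2) * (p2 - 1) - (p1 - 1) * (-p2**2)) / det
g1 = ((-p1) * (-p2**2) - (-p1**2) * (-p2)) / det

print(residual(p1, g0, g1), residual(p2, g0, g1))  # 0 0, by construction
print(residual(p3, g0, g1))                        # 1/8, not 0: no gamma works for every p
```

Any candidate $\gamma$ is pinned down by its values at two points, so the nonzero residual at the third point is exactly the contradiction the exercise asks for.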