Let's consider $X_1, X_2,\dots,X_n$, an i.i.d. sample from the distribution $$f(x; \theta) = \theta (\theta + 1) x^{\theta - 1} (1-x)\,1_{x \in(0,1)}$$ with $\theta > 0$.
I want to derive the asymptotic variances of the MLE and MME estimators of $\theta$.
My work so far
(1) For the MLE we compute the Fisher information:
$$L(X, \theta) = \theta^n (\theta+1)^n \Big(\prod_{i=1}^n x_i\Big)^{\theta-1} \prod_{i=1}^n (1-x_i) \prod_{i=1}^n 1_{(0,1)}(x_i)$$
On the event that all $x_i \in (0,1)$:
$$\ln L(X,\theta) = n\ln(\theta) + n\ln(\theta+1) + (\theta - 1)\sum_{i=1}^n \ln x_i + \sum_{i=1}^n \ln(1-x_i)$$
$$\frac{\partial \ln L(X, \theta)}{\partial \theta} = \frac{n}{\theta} + \frac{n}{\theta+1} + \sum_{i=1}^n \ln x_i$$
$$\frac{\partial^2 \ln L(X, \theta)}{\partial \theta^2} = -\frac{n}{\theta^2} - \frac{n}{(\theta + 1)^2}$$
The Fisher information is given as $$I(\theta) = -E\left[-\frac{n}{\theta^2} - \frac{n}{(\theta + 1)^2}\right] = \frac{n}{\theta^2} + \frac{n}{(\theta + 1)^2} = \frac{n(\theta+1)^2 + n\theta^2}{\theta^2(\theta+1)^2}$$
The asymptotic variance is then $\frac{1}{I(\theta)} = \frac{\theta^2(\theta+1)^2}{n(\theta+1)^2 + n\theta^2}$
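If it helps, here is a quick Monte Carlo sanity check (a sketch, not part of the derivation; the choices $\theta = 2$, $n$, and the number of replications are arbitrary). Setting the score $\frac{n}{\theta} + \frac{n}{\theta+1} + \sum \ln x_i = 0$ and multiplying through by $\theta(\theta+1)$ gives the quadratic $S\theta^2 + (S + 2n)\theta + n = 0$ with $S = \sum \ln x_i < 0$, whose positive root is the MLE, so no numerical optimizer is needed. Note that $f(x;\theta)$ is the $\mathrm{Beta}(\theta, 2)$ density, so NumPy's beta sampler generates the data:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 2000, 2000   # arbitrary choices for the check

def mle(x):
    # Positive root of S*theta^2 + (S + 2m)*theta + m = 0, S = sum(log x_i) < 0
    S, m = np.log(x).sum(), len(x)
    return (-(S + 2 * m) - np.sqrt(S**2 + 4 * m**2)) / (2 * S)

# f(x; theta) = theta*(theta+1)*x^(theta-1)*(1-x) is the Beta(theta, 2) density
est = np.array([mle(rng.beta(theta, 2, n)) for _ in range(reps)])

emp = n * est.var()                # empirical variance of sqrt(n)*(theta_hat - theta)
theory = theta**2 * (theta + 1)**2 / ((theta + 1)**2 + theta**2)
print(emp, theory)
```

For $\theta = 2$ the theoretical value $\frac{\theta^2(\theta+1)^2}{(\theta+1)^2+\theta^2} = \frac{36}{13} \approx 2.77$, and the empirical variance of $\sqrt n(\hat\theta - \theta)$ should land close to it.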
(2) MME
Since $E[X] = \frac{\theta}{\theta+2}$, solving $\bar X = \frac{\theta}{\theta+2}$ for $\theta$ gives the MME estimator $$\hat{\theta} = \frac{2\bar X}{1-\bar X}$$
Let's define the function $g(x) := \frac{2x}{1-x}$; then $g'(x) = \frac{2}{(1-x)^2}$
We know from the delta method that $\sqrt{n}(g(\bar X) - g(EX)) \rightarrow N(0, Var(X)\, g'(EX)^2)$, since $\sqrt n (\bar X - EX) \rightarrow N(0, Var X)$.
So the asymptotic variance of MME is given by:
$$VarX\cdot (g'(EX))^2 = \frac{\theta(\theta+2)^2}{2(\theta +3)}$$
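The same kind of simulation can sanity-check this value (again a sketch with arbitrary $\theta$, $n$, and replication count). It uses $E[X] = \theta/(\theta+2)$ and the moment estimator in the form $\hat\theta = 2\bar X/(1-\bar X)$, whose denominator $1-\bar X$ keeps the estimator positive for $\bar X \in (0,1)$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 2000, 2000   # arbitrary choices for the check

# Sample reps datasets of size n from Beta(theta, 2) and take sample means
xbar = rng.beta(theta, 2, (reps, n)).mean(axis=1)
est = 2 * xbar / (1 - xbar)        # moment estimator theta_hat = 2*xbar/(1 - xbar)

emp = n * est.var()                # empirical variance of sqrt(n)*(theta_hat - theta)
theory = theta * (theta + 2)**2 / (2 * (theta + 3))
print(emp, theory)
```

For $\theta = 2$ this gives $\frac{\theta(\theta+2)^2}{2(\theta+3)} = 3.2$, noticeably larger than the MLE's $36/13 \approx 2.77$, as expected from the efficiency of the MLE.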
I'm skipping the calculation of $Var(X) = E[X^2] - (E[X])^2$ since it is nothing more than computing integrals.
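For the record, those integrals are Beta integrals: since $\int_0^1 x^{a-1}(1-x)\,dx = \frac{1}{a(a+1)}$, we get
$$E[X^k] = \theta(\theta+1)\int_0^1 x^{\theta+k-1}(1-x)\,dx = \frac{\theta(\theta+1)}{(\theta+k)(\theta+k+1)},$$
so $E[X] = \frac{\theta}{\theta+2}$, $E[X^2] = \frac{\theta(\theta+1)}{(\theta+2)(\theta+3)}$, and $Var(X) = \frac{2\theta}{(\theta+2)^2(\theta+3)}$.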
Questions
Is my asymptotic variance for the MLE estimator correct? The intuitive problem I have is that it depends on the sample size.
Is my way of deriving the asymptotic variance of the MME estimator appropriate?