The question is:
Given the negative binomial distribution $f(x;\theta) = \frac{4(x+1)}{(2+\theta)^{x+2}}\,\theta^x$, find the MLE of $\theta$ based on $n$ independent random variables $X_{1}, ..., X_{n}$, and hence determine the MLE $\hat{\tau}_{MLE}$ of $\tau = \log(\frac{\theta}{2+\theta})$.
Also, given that $E(X_{1}) = \theta$, determine the asymptotic distribution of $\sqrt{n}(\hat{\tau}_{MLE}-\tau)$.
a) The MLE is supposed to be $\bar{x}$, but I'm not sure where I'm going wrong, since differentiating gives me $(\sum x_{i}+n)(\theta +2)(\log(\theta + 2))^2 = -(\bar{x}+1)(2-\theta)\sum \log(x_{i} +1)$.
b) and c) Also not too sure how to approach the other two parts.
Any help is appreciated! Thanks!
I suggest looking at this link to see which concepts are needed to answer this question:
https://stats.stackexchange.com/questions/14471/how-do-you-calculate-standard-errors-for-a-transformation-of-the-mle
(a) If you write the log-likelihood, then up to terms not depending on $\theta$:
$\log \mathcal{L}(\theta;x_i) \sim \sum_i x_i \log(\theta)-\sum_i (x_i+2)\log(\theta+2)$
The condition $\partial_{\theta} \log \mathcal{L}(\theta;x_i) = \frac{\sum_i x_i}{\theta}-\frac{\sum_i (x_i+2)}{\theta+2} = 0$ gives $(\theta+2)\sum_i x_i = \theta(\sum_i x_i + 2N)$, which identifies $\theta=\sum_{i}x_i/N$ as the optimal value. Therefore:
$\hat \theta_{MLE}=\frac{1}{N}\sum_i X_i = \bar X$
and therefore, by the invariance property of the MLE:
$\hat \tau_{MLE}=\log\left(\frac{\hat \theta_{MLE}}{\hat \theta_{MLE}+2}\right)$
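As a quick numerical sanity check (my own sketch, not part of the question): note that $f(x;\theta)=\frac{4(x+1)}{(2+\theta)^{x+2}}\theta^x$ is exactly a negative binomial with $r=2$ and success probability $2/(\theta+2)$ in numpy's parameterisation, so you can simulate from it and verify that the numerical maximiser of the log-likelihood coincides with the sample mean:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# f(x; theta) = 4(x+1) theta^x / (2+theta)^(x+2) is a negative binomial
# with n = 2 and p = 2/(theta+2) in numpy's convention, so we can
# simulate from it directly.
rng = np.random.default_rng(0)
theta_true = 3.0
x = rng.negative_binomial(2, 2.0 / (theta_true + 2.0), size=100_000)

def neg_loglik(theta):
    # log L up to theta-free terms: sum x_i log(theta) - sum (x_i+2) log(theta+2)
    return -(np.sum(x) * np.log(theta) - np.sum(x + 2) * np.log(theta + 2))

res = minimize_scalar(neg_loglik, bounds=(1e-6, 50.0), method="bounded")
print(res.x, x.mean())  # the two agree to numerical precision
```

The numerical optimum lands on $\bar x$, as the derivation above predicts.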
(b) Following the link, you need the Fisher information. This is nothing mysterious. According to my calculations:
$\frac{\partial^2 \log \mathcal{L}(\theta; x_i)}{\partial \theta^2}=-\frac{S}{\theta^2}+\frac{S+2N}{(\theta+2)^2}$
where $S=\sum_i X_i$.
Now, using $E[S]=N\theta$, everything is on the table. I get:
$\frac{1}{\sigma^2} = I(\theta) = N\left(\frac{1}{\theta}-\frac{1}{\theta+2}\right)=\frac{2N}{\theta(\theta+2)}$
I'll double-check the calculations later; for the moment you can see that $\sigma^2 = O(1/N)$.
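If the Fisher information above is right, the MLE's variance should be about $\theta(\theta+2)/(2N)$. A quick Monte Carlo check (again my own sketch, assuming numpy and the negative-binomial identification of $f$):

```python
import numpy as np

# Monte Carlo check that Var(theta_hat) ~ 1/I(theta) = theta(theta+2)/(2N),
# where theta_hat = sample mean. Simulate via the negative binomial with
# n = 2 and p = 2/(theta+2) (numpy's convention), which matches f(x; theta).
rng = np.random.default_rng(1)
theta, N, reps = 3.0, 500, 20_000
samples = rng.negative_binomial(2, 2.0 / (theta + 2.0), size=(reps, N))
theta_hat = samples.mean(axis=1)
print(theta_hat.var(), theta * (theta + 2) / (2 * N))  # close
```

The empirical variance of $\hat\theta_{MLE}$ matches $1/I(\theta)$ well for moderate $N$.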
(c) Following the link again, all you need to add to point (b) is $g'(x)$, where $g(x)=\log\left(\frac{x}{x+2}\right)$. I leave the calculations to you...