I don't really know how to prove whether this estimator is consistent, but I tried the following and would appreciate it if someone could tell me whether it is correct.
Calculating the likelihood function:
$$L(\theta) = f(x_{1},\dots,x_{n};\theta) = \prod_{i=1}^{n}(\theta+1)x_{i}^{\theta} = (\theta+1)^{n}\left(\prod_{i=1}^{n}x_{i}\right)^{\theta}$$
Since $\frac{d}{d\theta}\log L(\theta) = \frac{n}{\theta+1} + \sum_{i=1}^{n}\log(x_{i})$, setting this derivative to zero and solving for $\theta$ gives:
$$\hat{\theta} = -\frac{n}{\sum_{i=1}^{n}\log(x_{i})} - 1$$
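As a quick numerical sanity check of this formula (not part of the proof), we can sample from the density $f(x;\theta) = (\theta+1)x^{\theta}$ on $(0,1)$ by inverse-CDF sampling, since $F(x) = x^{\theta+1}$, and see whether $\hat{\theta}$ lands near the true value. The choice $\theta = 2$ below is just an arbitrary test value:

```python
import math
import random

random.seed(0)
theta = 2.0        # true parameter, chosen arbitrarily for this check
n = 100_000

# f(x; theta) = (theta+1) x^theta on (0,1) has CDF F(x) = x^(theta+1),
# so inverse-CDF sampling gives X = U^(1/(theta+1)) with U ~ Uniform(0,1).
xs = [random.random() ** (1.0 / (theta + 1.0)) for _ in range(n)]

# MLE derived above: theta_hat = -n / sum(log x_i) - 1
theta_hat = -n / sum(math.log(x) for x in xs) - 1.0
print(theta_hat)   # should be close to theta = 2
```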
Now we check whether it is consistent. For that, we first calculate the Fisher information:
$$I(\theta) = -\mathbb{E}\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X\mid\theta)\right] = -\mathbb{E}\left[\frac{\partial^{2}}{\partial\theta^{2}}\bigl(\log(\theta+1) + \theta\log X\bigr)\right] = -\mathbb{E}\left[\frac{-1}{(\theta+1)^{2}}\right] = \frac{1}{(\theta+1)^{2}}$$
The asymptotic variance of the MLE is then
$$\mathrm{Var}(\hat{\theta}) \approx \frac{1}{nI(\theta)} = \frac{(\theta+1)^{2}}{n}$$
(this is the asymptotic variance from MLE theory, not the exact finite-sample variance).
With this we can check consistency:
$$\lim_{n \to \infty}\mathrm{Var}(\hat{\theta}_{n}) = \lim_{n \to \infty}\frac{(\theta+1)^{2}}{n} = 0$$
Since the MLE is asymptotically unbiased and its variance tends to $0$, the estimator $\hat{\theta}_{n}$ is consistent.
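We can also illustrate the variance calculation empirically (again, not a proof): simulate many replications of $\hat{\theta}$ at several sample sizes and compare the empirical variance with $(\theta+1)^{2}/n$. The values of $\theta$, the sample sizes, and the number of replications below are arbitrary choices for the experiment:

```python
import math
import random

def sample(n, theta, rng):
    # Inverse-CDF sampling from f(x; theta) = (theta+1) x^theta on (0,1),
    # whose CDF is F(x) = x^(theta+1).
    return [rng.random() ** (1.0 / (theta + 1.0)) for _ in range(n)]

def mle(xs):
    # MLE derived above: theta_hat = -n / sum(log x_i) - 1
    n = len(xs)
    return -n / sum(math.log(x) for x in xs) - 1.0

rng = random.Random(1)
theta = 2.0          # true parameter, arbitrary test value
reps = 2000          # Monte Carlo replications per sample size
results = {}
for n in (50, 500, 5000):
    ests = [mle(sample(n, theta, rng)) for _ in range(reps)]
    mean = sum(ests) / reps
    results[n] = sum((e - mean) ** 2 for e in ests) / reps
    # empirical variance vs. asymptotic variance (theta+1)^2 / n
    print(n, results[n], (theta + 1.0) ** 2 / n)
```

The empirical variances shrink roughly like $1/n$ and track $(\theta+1)^{2}/n$, which is the behavior the consistency argument relies on.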