I have a sample of $n$ random variables with discrete uniform distribution over $\{1, \dots, \theta\}$. I have to prove that $$T = \frac{X_{(n)}^{n+1} - (X_{(n)} - 1)^{n+1}}{X_{(n)}^{n} - (X_{(n)} - 1)^{n}}$$ is an efficient estimator for $\theta$ among unbiased estimators.
This estimator is unbiased, so $\mathsf{E}T = \theta$, and therefore $\mathsf{E}(T - \theta)^2$ is just the variance of $T$. So I have to find $\mathsf{E}(T^2)$, which is where things get complicated.
The formula probably goes like this: $$\mathsf{E}(T^2) = \sum\limits_{x=1}^{\theta}\left(\frac{x^{n+1} - (x - 1)^{n+1}}{x^n - (x - 1)^n}\right)^2\cdot\frac{x^n - (x - 1)^n}{\theta^n}$$ (the last fraction is the probability mass function of $X_{(n)}$).
And I have no idea how to proceed from here. Maybe my approach to the problem is wrong, or maybe there's a really obvious way to deal with $\mathsf{E}(T^2)$ that I just don't see.
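For what it's worth, the sum can be evaluated exactly with rational arithmetic. A small sketch (function names are my own) confirms that $\mathsf{E}T = \theta$ exactly, while the variance shows no obvious closed form:

```python
from fractions import Fraction

def pmf_max(y, n, theta):
    """P(X_(n) = y) = (y^n - (y-1)^n) / theta^n, exactly."""
    return Fraction(y**n - (y - 1)**n, theta**n)

def T(y, n):
    """The estimator T evaluated at X_(n) = y."""
    return Fraction(y**(n + 1) - (y - 1)**(n + 1), y**n - (y - 1)**n)

def moments(n, theta):
    """Exact E[T] and Var(T) via the pmf of X_(n)."""
    ET = sum(T(y, n) * pmf_max(y, n, theta) for y in range(1, theta + 1))
    ET2 = sum(T(y, n)**2 * pmf_max(y, n, theta) for y in range(1, theta + 1))
    return ET, ET2 - ET**2

print(moments(3, 5))  # first component is exactly 5, confirming E(T) = theta
```

Unbiasedness drops out because the numerator of $T$ cancels the denominator of the pmf, leaving a telescoping sum $\sum_{x=1}^{\theta}\frac{x^{n+1}-(x-1)^{n+1}}{\theta^n}=\theta$; no such cancellation happens in $\mathsf{E}(T^2)$.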
We can prove that the estimator $T$ is the uniformly minimum variance unbiased estimator (UMVUE) of $\theta$ for the discrete uniform distribution over $\{1,2,\ldots,\theta\}$.
First show that $X_{(n)}$ is a complete sufficient statistic for $\theta$. Sufficiency is easy to prove using the factorization theorem; the argument for completeness is slightly more involved.
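One way to sketch the completeness argument: suppose $g$ is any function with $$\mathsf{E}_\theta\, g(X_{(n)})=\sum_{y=1}^{\theta} g(y)\,\frac{y^n-(y-1)^n}{\theta^n}=0\qquad\text{for every }\theta\in\{1,2,\ldots\}\,.$$ Taking $\theta=1$ gives $g(1)=0$; multiplying the equation for $\theta$ by $\theta^n$ and subtracting the corresponding equation for $\theta-1$ leaves $g(\theta)\left(\theta^n-(\theta-1)^n\right)=0$, hence $g(\theta)=0$ for every $\theta$ by induction. This is exactly completeness of $X_{(n)}$.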
Since $$E(X_1)=\frac{\theta+1}{2}\,,$$
an unbiased estimator of $\theta$ is $$h(X_1)=2X_1-1$$
By the Lehmann–Scheffé theorem, the UMVUE of $\theta$ is given by the conditional expectation $$E(h\mid X_{(n)})$$
Note that the conditional distribution of $X_1$ given $X_{(n)}$ is of the form
\begin{align} P(X_1=j\mid X_{(n)}=y)&=\begin{cases}\frac{y^{n-1}-(y-1)^{n-1}}{y^n-(y-1)^n}&,\text{ if }j=1,2,\ldots,y-1\\\\\frac{y^{n-1}}{y^n-(y-1)^n}&,\text{ if }j=y\\\\\quad0&,\text{ otherwise }\end{cases} \end{align}
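This conditional pmf can be sanity-checked for small $n$ and $\theta$ by enumerating all $\theta^n$ equally likely samples (a brute-force sketch; the helper names are my own):

```python
from fractions import Fraction
from itertools import product

def cond_pmf(j, y, n):
    """The claimed P(X_1 = j | X_(n) = y)."""
    denom = y**n - (y - 1)**n
    if 1 <= j <= y - 1:
        return Fraction(y**(n - 1) - (y - 1)**(n - 1), denom)
    if j == y:
        return Fraction(y**(n - 1), denom)
    return Fraction(0)

def check_cond_pmf(n, theta):
    """Compare the formula with direct enumeration of all theta^n samples."""
    for y in range(1, theta + 1):
        samples = [s for s in product(range(1, theta + 1), repeat=n)
                   if max(s) == y]
        for j in range(1, theta + 1):
            empirical = Fraction(sum(s[0] == j for s in samples), len(samples))
            assert empirical == cond_pmf(j, y, n)
    return True

print(check_cond_pmf(3, 4))  # True
```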
So,
\begin{align} E(h\mid X_{(n)}=y)&=2E(X_1\mid X_{(n)}=y)-1 \\&=2\left[\frac{y^{n-1}-(y-1)^{n-1}}{y^n-(y-1)^n}\sum_{j=1}^{y-1}j+y\cdot\frac{y^{n-1}}{y^n-(y-1)^n}\right]-1 \\&=\frac{y^{n+1}-(y-1)^{n+1}}{y^n-(y-1)^n} \end{align}
Thus the UMVUE of $\theta$ is $$T=\frac{X_{(n)}^{n+1}-(X_{(n)}-1)^{n+1}}{X_{(n)}^n-(X_{(n)}-1)^n}$$
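As a finite check on the algebra above, the closed form can be compared against a brute-force computation of $E(2X_1-1\mid X_{(n)}=y)$ over all $\theta^n$ equally likely samples (a sketch with hypothetical helper names):

```python
from fractions import Fraction
from itertools import product

def closed_form(y, n):
    """T evaluated at X_(n) = y."""
    return Fraction(y**(n + 1) - (y - 1)**(n + 1), y**n - (y - 1)**n)

def check_conditional_expectation(n, theta):
    """Brute-force E(2*X_1 - 1 | X_(n) = y) versus the closed form."""
    for y in range(1, theta + 1):
        samples = [s for s in product(range(1, theta + 1), repeat=n)
                   if max(s) == y]
        cond_exp = Fraction(sum(2 * s[0] - 1 for s in samples), len(samples))
        assert cond_exp == closed_form(y, n)
    return True

print(check_conditional_expectation(3, 4))  # True
```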
Naturally, $T$ is the most efficient estimator of $\theta$ within the unbiased class.
Edit:
To derive the distribution of $X_1$ conditioned on $X_{(n)}$, we first find the joint pmf as follows:
For $j=1,2,\ldots,y-1$,
\begin{align} P(X_1=j,X_{(n)}=y)&=P\left(X_1=j,\max_{2\le i\le n} X_i=y\right) \\&=P(X_1=j)P\left(\max_{2\le i\le n} X_i=y\right) \end{align}
And for $j=y$,
\begin{align} P(X_1=j,X_{(n)}=y)&=P(X_1=y,X_{(n)}=y) \\&=P(X_1=y,X_2\le y,\ldots,X_n\le y) \\&=P(X_1=y)P(X_2\le y,\ldots,X_n\le y) \end{align}
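Since the $X_i$ are i.i.d. uniform on $\{1,\ldots,\theta\}$, the two factors evaluate to \begin{align} P(X_1=j,X_{(n)}=y)&=\begin{cases}\frac{1}{\theta}\cdot\frac{y^{n-1}-(y-1)^{n-1}}{\theta^{n-1}}&,\text{ if }j=1,2,\ldots,y-1\\\\\frac{1}{\theta}\cdot\left(\frac{y}{\theta}\right)^{n-1}&,\text{ if }j=y\end{cases} \end{align} Dividing by $P(X_{(n)}=y)=\frac{y^n-(y-1)^n}{\theta^n}$ cancels every power of $\theta$ and yields the conditional pmf displayed earlier.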