Say we have $X \sim \text{unif}\{1,\dots,\theta\}$. I am interested in understanding how one can compute $E[\bar{X}|X_{(n)}]$, where $\bar{X} = \frac{1}{n}\sum_{i=1}^{n}X_i$ and $X_{(n)} = \max_i X_i$.
$$E[\bar{X}|X_{(n)}] = E\Big[\frac{1}{n}\sum_{i=1}^{n}X_i \,\Big|\, X_{(n)}\Big]=\frac{1}{n}\sum_{i=1}^{n}E[X_i|X_{(n)}] = E[X|X_{(n)}],$$
where the last equality holds because the $X_i$ are exchangeable given $X_{(n)}$, so each term $E[X_i|X_{(n)}]$ has the same value.
So we are interested in: $E[X|X_{(n)}] = \sum_x x\,P(x|X_{(n)})$
I am confused as to how we can find $P(x|X_{(n)})$.
Here is a try: $P(x|x \leq X_{(n)}) = \frac{P(x ,\; x \leq X_{(n)})}{P(x \leq X_{(n)})} = \frac{P(x)}{P(x \leq X_{(n)})} = \frac{1/\theta}{X_{(n)}/\theta} = \frac{1}{X_{(n)}}$
This finally gives, for the conditional expectation, $E[X|X_{(n)}] = \frac{X_{(n)}+1}{2} = E[\bar{X}|X_{(n)}]$.
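As a quick numeric sanity check of this formula (my own sketch; the values of $\theta$, $n$, and $m$ are arbitrary choices), one can simulate i.i.d. draws and compare the empirical $E[X_1 \mid X_{(n)} = m]$ with the claimed $(m+1)/2$:

```python
# Sanity check (illustrative; not part of the derivation): estimate
# E[X_1 | max = m] by simulation and compare with (m + 1) / 2.
import random

random.seed(0)
theta, n, m = 6, 4, 5  # arbitrary example values

draws = []
for _ in range(200_000):
    x = [random.randint(1, theta) for _ in range(n)]
    if max(x) == m:          # keep only samples where the max equals m
        draws.append(x[0])   # record the first coordinate

print(sum(draws) / len(draws), (m + 1) / 2)
```

With these parameters the empirical conditional mean lands noticeably above $(m+1)/2 = 3$, which suggests that conditioning on the event $\{x \leq X_{(n)}\}$ is not the same as conditioning on the value of $X_{(n)}$ itself.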
My issue with this is the following: the statistic $U = 2\bar{X}-1$ is unbiased for $\theta$, and $X_{(n)}$ is sufficient and complete. By the Lehmann–Scheffé theorem, $E[U|X_{(n)}]$ should therefore be the UMVUE; in particular, it should be unbiased.
However, plugging in the result above gives $E[U|X_{(n)}] = X_{(n)}$, which is itself biased, since $E[X_{(n)}] = \theta - \left(\frac{\theta -1}{\theta}\right)^n - \dots -\left(\frac{1}{\theta}\right)^n < \theta$.
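That closed form for $E[X_{(n)}]$ can be checked numerically (a small sketch of my own; the function names are mine) against brute-force enumeration of all $\theta^n$ equally likely outcomes:

```python
# Check E[X_(n)] = theta - sum_{k=1}^{theta-1} (k / theta)^n against a
# brute-force average of max(x) over every outcome in {1..theta}^n.
from itertools import product

def e_max_formula(theta, n):
    return theta - sum((k / theta) ** n for k in range(1, theta))

def e_max_bruteforce(theta, n):
    outcomes = list(product(range(1, theta + 1), repeat=n))
    return sum(max(x) for x in outcomes) / len(outcomes)

theta, n = 5, 3  # small example so enumeration stays cheap
print(e_max_formula(theta, n), e_max_bruteforce(theta, n))
```

Both agree (4.2 for $\theta=5$, $n=3$) and are strictly below $\theta$, confirming the bias.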
Thank you for your insight!
I assume you have $X_1, \dots, X_n$ i.i.d. and uniform over $\{1, \dots, \theta\}$, for some given integer $\theta$. Define $M = \max[X_1, \dots, X_n]$. So you want to compute $E[X_1 | M=m]$ for all $m \in \{1, \dots, \theta\}$. You can first use Bayes' rule to get $P[X_1=i | M=m]$ for all $i \in \{1, \dots, \theta\}$.
For $i > m$ we get $P[X_1=i|M=m]=0$. For $i \in \{1, \dots, m\}$ we get: $$ P[X_1=i|M=m] = \frac{P[M=m|X_1=i]P[X_1=i]}{P[M=m]} $$ You know $P[X_1=i]=1/\theta$, and you can also compute $P[M=m]$ easily. The tricky part is $P[M=m|X_1=i]$, which must be computed separately for $i=m$ and $i<m$. You will find the probability is the same for all $i \in \{1, \dots, m-1\}$, and differs only for $i=m$.
Then $$ E[X_1|M=m] = \sum_{i=1}^m i P[X_1=i|M=m] $$
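The steps above can be sketched in code (my own sketch; variable names and the filled-in conditional probabilities are mine), using exact fractions and cross-checking against brute-force enumeration:

```python
# Exact E[X_1 | M = m] via Bayes' rule, verified by enumerating all outcomes.
from fractions import Fraction as F
from itertools import product

def cond_exp(theta, n, m):
    """E[X_1 | M = m] using the Bayes decomposition described above."""
    p_M = F(m, theta) ** n - F(m - 1, theta) ** n               # P[M = m]
    # P[M = m | X_1 = i] for i < m: the other n-1 draws must have max m.
    p_lt = F(m, theta) ** (n - 1) - F(m - 1, theta) ** (n - 1)
    # P[M = m | X_1 = m]: the other n-1 draws only need to be <= m.
    p_eq = F(m, theta) ** (n - 1)
    def posterior(i):                                           # P[X_1 = i | M = m]
        return (p_eq if i == m else p_lt) * F(1, theta) / p_M
    return sum(i * posterior(i) for i in range(1, m + 1))

def cond_exp_bruteforce(theta, n, m):
    samples = [x for x in product(range(1, theta + 1), repeat=n) if max(x) == m]
    return F(sum(x[0] for x in samples), len(samples))

theta, n = 6, 4  # arbitrary small example
for m in range(1, theta + 1):
    assert cond_exp(theta, n, m) == cond_exp_bruteforce(theta, n, m)
```

For example, with $\theta=6$, $n=4$, $m=5$ this gives $E[X_1|M=5] = 1235/369 \approx 3.35$, not $(m+1)/2 = 3$, which is exactly the discrepancy raised in the question.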