Given $X_1,\dots,X_n$ i.i.d. from $P_\theta$, the usual way of showing that a statistic $T$ is sufficient is to use the factorization theorem, as opposed to computing the distribution of $(X_1,\dots,X_n)\mid T$ and showing that it is free of the parameter of interest $\theta$. Suppose that $X_1,\dots,X_n$ are i.i.d. samples from the uniform distribution on $(0,\theta)$. I want to compute the distribution of $X\mid X_{(n)}$ and show that it is free of $\theta$. I am struggling with the rigorous justification here because $X\mid X_{(n)}$ does not have a density with respect to Lebesgue measure: the conditioning event $\{X_{(n)}=t\}$ has probability zero, so the conditional distribution cannot be obtained by naively dividing densities. What is the correct way to derive the distribution here?
The following is an attempt:
\begin{align*} &P(X_1\le x_1,\dots, X_n \le x_n\mid X_{(n)}=t)\\ &= \sum_{i=1}^n P(X_1\le x_1,\dots, X_n \le x_n\mid X_{(n)}=t, X_{(n)}=X_i)\, P(X_{(n)}=X_i\mid X_{(n)}=t)\\ &= \frac{1}{n} \sum_{i=1}^n P(X_1\le x_1,\dots, X_n \le x_n\mid X_{(n)}=t, X_{(n)}=X_i) \end{align*}
where the last equality holds by the symmetry of the problem: each $X_i$ is as likely to be the largest as any of the others. Now, from the discussion here, we have that
$$ X_1,\dots, X_{i-1}, X_{i+1},\dots, X_n|X_{(n)}=X_i= t $$ is distributed as $n-1$ i.i.d. uniform random variables on $[0,t]$. So can we then write:
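This quoted fact can be checked empirically. The following is a minimal Monte Carlo sketch (not part of the proof), where the measure-zero event $\{X_{(n)}=t\}$ is approximated by the small window $\{t\le X_{(n)}\le t+\varepsilon\}$; the parameter names (`n`, `t`, `eps`, `trials`) are illustrative choices, not anything fixed by the problem.

```python
import numpy as np

# Sketch: conditional on the maximum landing in a small window around t,
# the remaining n-1 sample values should behave like i.i.d. Uniform(0, t)
# draws for ANY theta > t. The window {t <= X_(n) <= t + eps} approximates
# the measure-zero event {X_(n) = t}.
rng = np.random.default_rng(0)
n, t, eps, trials = 3, 1.0, 0.01, 400_000

def non_max_values(theta):
    """Values of the n-1 non-maximal coordinates, given X_(n) in [t, t+eps]."""
    X = rng.uniform(0, theta, size=(trials, n))
    M = X.max(axis=1)
    rows = X[(M >= t) & (M <= t + eps)]
    mask = np.ones(rows.shape, dtype=bool)
    mask[np.arange(len(rows)), rows.argmax(axis=1)] = False  # drop the max
    return rows[mask]

for theta in (1.2, 2.0):
    vals = non_max_values(theta)
    # Uniform(0, t) has mean t/2 = 0.5, whatever theta is
    print(theta, round(vals.mean(), 3))
```

Both values of $\theta$ give non-maximal coordinates whose empirical mean is close to $t/2$, consistent with the claim that the conditional law is Uniform$[0,t]$ and free of $\theta$.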
\begin{align*} &P(X_1\le x_1,\dots, X_n \le x_n\mid X_{(n)}=t, X_{(n)}=X_i)\\ &= P(X_1\le x_1,\dots, X_{i-1}\le x_{i-1}, X_{i+1}\le x_{i+1},\dots, X_n \le x_n\mid X_{(n)}=t, X_{(n)}=X_i) \times \mathbf{1}\{t \le x_i\}\\ &= \mathbf{1}\{t \le x_i\} \prod_{j \neq i} \frac{\min(x_j,t)}{t}, \end{align*} since, given $X_i = t$, the event $\{X_i\le x_i\}$ occurs exactly when $t\le x_i$, and each remaining variable has the Uniform$[0,t]$ CDF $\min(x_j,t)/t$. Plugging this back into the expression above yields
\begin{align*} P(X_1\le x_1,\dots, X_n \le x_n\mid X_{(n)}=t) &= \frac{1}{n} \sum_{i=1}^n \mathbf{1}\{t\le x_i\} \prod_{j \neq i} \frac{\min(x_j,t)}{t}. \end{align*}
I'm not completely convinced by the derivation, and I'm not sure what to do next. The final expression is at least free of $\theta$, as desired, but I don't see how to rigorously justify conditioning on the probability-zero event $\{X_{(n)}=t\}$.
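One way to build confidence in the resulting expression is a quick simulation (a sketch, not a proof): approximate the conditioning event by the window $\{t\le X_{(n)}\le t+\varepsilon\}$ and compare the empirical conditional probability with the closed form $\frac{1}{n}\sum_i \mathbf{1}\{t\le x_i\}\prod_{j\neq i}\min(x_j,t)/t$; note the Uniform$[0,t]$ factors $\min(x_j,t)/t$, not $x_j/\theta$, which is what makes the expression free of $\theta$. The evaluation point `x` below is an arbitrary illustrative choice.

```python
import numpy as np

# Sanity-check by simulation: compare the empirical conditional probability
# P(X_1 <= x_1, ..., X_n <= x_n | X_(n) ~ t) with the closed form
# (1/n) * sum_i 1{t <= x_i} * prod_{j != i} min(x_j, t)/t.
rng = np.random.default_rng(1)
n, theta, t, eps, trials = 3, 2.0, 1.5, 0.02, 1_000_000
x = np.array([1.6, 1.0, 1.2])            # arbitrary evaluation point

X = rng.uniform(0, theta, size=(trials, n))
M = X.max(axis=1)
rows = X[(M >= t) & (M <= t + eps)]      # condition on X_(n) ~ t
empirical = np.mean(np.all(rows <= x, axis=1))

formula = sum(
    (t <= x[i]) * np.prod([min(x[j], t) / t for j in range(n) if j != i])
    for i in range(n)
) / n

print(round(empirical, 3), round(formula, 3))   # the two should be close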
Don't worry about the index where the maximum $X_{(n)}$ occurs. Instead, notice that the conditional distribution of $X:=(X_1,X_2,\ldots, X_n)$ given $(X_{(1)},X_{(2)},\ldots,X_{(n)})=(y_1,y_2,\ldots,y_n)$ is uniform over all $n!$ permutations of $(y_1,y_2,\ldots,y_n)$, and hence is free of $\theta$. To find the conditional distribution of $X$ given $X_{(n)}$, write $L:=(X_{(1)},\ldots, X_{(n-1)})$ and $M:=X_{(n)}$ and use $$ P(X\in A\mid M=t) = \int P(X\in A \mid L=s, M=t)\, f_{L\mid M}(s\mid t)\, ds. $$ The first factor of the integrand is free of $\theta$. So is the second factor, by the result you quoted, so we are done.
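The permutation fact the answer relies on is also easy to see empirically. A minimal sketch, assuming i.i.d. continuous samples: the rank ordering of $(X_1,\dots,X_n)$ is uniform over all $n!$ permutations (and independent of the order statistics), so no $\theta$ enters.

```python
import numpy as np
from itertools import permutations
from collections import Counter

# Empirical illustration of the key fact: for i.i.d. continuous samples,
# the rank ordering of (X_1, ..., X_n) is uniform over the n! permutations.
# (Independence from the order statistics then gives the conditional claim.)
rng = np.random.default_rng(2)
n, theta, trials = 3, 2.0, 600_000

X = rng.uniform(0, theta, size=(trials, n))
orderings = Counter(map(tuple, np.argsort(X, axis=1)))

for perm in permutations(range(n)):
    print(perm, round(orderings[perm] / trials, 3))  # each frequency ~ 1/6
```

Each of the $3!=6$ orderings occurs with empirical frequency close to $1/6$, and repeating this with a different $\theta$ changes nothing, since ranks are invariant under the scaling $X_i = \theta U_i$.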