From Stochastic Processes by Ross
At the top of page 75 (in the proof of Lemma 2.3.4, which begins at the bottom of page 74), the author writes (summarized here):
Let $\tau_1, \dots, \tau_n$ denote the ordered values from a set of $n$ independent uniform $(0,t)$ random variables. Let $Y_1, \dots, Y_n$ be i.i.d. nonnegative random variables that are also independent of $\{\tau_1, \dots, \tau_n\}$. Then $$P(Y_1 + \dots + Y_k < \tau_k,\ k = 1, \dots, n \mid Y_1 + \dots + Y_n = y) = \begin{cases} 1 - y/t & \text{if } 0 < y < t, \\ 0 & \text{otherwise.} \end{cases}$$ We prove this by induction. Suppose it is true for $n-1$. We condition on the values of $Y_1 + \dots + Y_{n-1}$ and $\tau_n$, and then use the fact that, conditional on $\tau_n = u$, the variables $\tau_1, \dots, \tau_{n-1}$ are distributed as the order statistics from a set of $n-1$ independent uniform $(0,u)$ random variables. Doing so for $s < y$ implies $$P(Y_1 + \dots + Y_k < \tau_k,\ k = 1, \dots, n \mid Y_1 + \dots + Y_{n-1} = s,\ \tau_n = u,\ Y_1 + \dots + Y_n = y)$$ $$= P(Y_1 + \dots + Y_k < \tau_k^{*},\ k = 1, \dots, n-1 \mid Y_1 + \dots + Y_{n-1} = s)$$ for $y < u$, where the $\tau_k^{*}$ are the ordered values from a set of $n-1$ independent uniform $(0,u)$ random variables.
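To convince myself the lemma's formula is at least numerically right, I ran a quick Monte Carlo sketch of my own (this is not from Ross): I took $n = 2$, $t = 1$, chose $Y_i \sim \text{Exp}(1)$ (my assumption; any nonnegative i.i.d. choice should work), and approximated the continuous conditioning event $\{Y_1 + Y_2 = y\}$ by a small window $[y - \varepsilon, y + \varepsilon]$.

```python
import random

random.seed(0)

# Sanity check of the lemma's formula P(...) = 1 - y/t, not Ross's proof.
# Assumptions (mine): n = 2, t = 1, Y_i i.i.d. Exp(1), and the exact
# conditioning {Y_1 + Y_2 = y} replaced by a window of half-width eps.
n, t, y, eps = 2, 1.0, 0.4, 0.02
trials, hits, accepted = 400_000, 0, 0

for _ in range(trials):
    ys = [random.expovariate(1.0) for _ in range(n)]
    if abs(sum(ys) - y) > eps:          # reject unless Y_1 + Y_2 ~= y
        continue
    accepted += 1
    taus = sorted(random.uniform(0, t) for _ in range(n))  # order statistics
    # Event: every partial sum of the Y's stays below the matching tau.
    partial, ok = 0.0, True
    for yk, tk in zip(ys, taus):
        partial += yk
        if partial >= tk:
            ok = False
            break
    hits += ok

estimate = hits / accepted
print(estimate, 1 - y / t)  # estimate should be near 1 - y/t = 0.6
```

The empirical frequency comes out close to $1 - y/t = 0.6$, consistent with the lemma.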
I don't understand how he gets $$P(Y_1 + \dots + Y_k < \tau_k,\ k = 1, \dots, n \mid Y_1 + \dots + Y_{n-1} = s,\ \tau_n = u,\ Y_1 + \dots + Y_n = y)$$ $$= P(Y_1 + \dots + Y_k < \tau_k^{*},\ k = 1, \dots, n-1 \mid Y_1 + \dots + Y_{n-1} = s)$$ from "conditional on $\tau_n = u$, the variables $\tau_1, \dots, \tau_{n-1}$ are distributed as the order statistics from a set of $n-1$ uniform $(0,u)$ random variables."
It seems that the event to the left of the conditioning bar in $P(Y_1 + \dots + Y_k < \tau_k,\ k = 1, \dots, n \mid Y_1 + \dots + Y_{n-1} = s,\ \tau_n = u,\ Y_1 + \dots + Y_n = y)$ involves not $\tau_n$ alone but two families of random variables (the $Y$'s and the $\tau$'s), so I don't see which axioms of probability or which theorems he is invoking to derive this equality.
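As a separate check on the quoted fact itself, here is another small simulation of my own (again, not from Ross): with $n = 3$ uniforms on $(0,t)$, conditioning on $\tau_n \approx u$ (window of half-width eps, my assumption), the remaining $\tau_1, \tau_2$ should look like the order statistics of two independent uniform $(0,u)$ variables, whose means are $u/3$ and $2u/3$.

```python
import random

random.seed(1)

# Illustrates: conditional on tau_n = u, the other taus are distributed
# as order statistics of n-1 uniform(0, u) variables.
# Assumptions (mine): n = 3, t = 1, u = 0.8, exact conditioning on
# tau_n = u replaced by the window [u - eps, u + eps].
n, t, u, eps = 3, 1.0, 0.8, 0.01
min_sum, max_sum, accepted = 0.0, 0.0, 0

for _ in range(300_000):
    taus = sorted(random.uniform(0, t) for _ in range(n))
    if abs(taus[-1] - u) > eps:   # reject unless the largest value ~= u
        continue
    accepted += 1
    min_sum += taus[0]            # smaller of the remaining two
    max_sum += taus[1]            # larger of the remaining two

# For two i.i.d. uniform(0, u): E[min] = u/3, E[max] = 2u/3.
print(min_sum / accepted, u / 3)
print(max_sum / accepted, 2 * u / 3)
```

The empirical means match $u/3 \approx 0.267$ and $2u/3 \approx 0.533$, so the distributional fact is clear to me; it is only the step from that fact to the displayed equality of conditional probabilities that I am stuck on.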
Anyone have any ideas?