Suppose we sample $k+1$ i.i.d. random variables $X_i$ uniformly at random in $[0,1]$.
What is
$$
\mathbb{E}[X_i | X_1 > X_2 > \cdots > X_k \cap X_{k+1} > X_k]?
$$
To start, suppose we sample $k$ i.i.d. random variables $X_i$ uniformly at random in $[0,1]$.
Let $X_i'$ be the $i$'th smallest value. What is $\mathbb{E}[X_i']$?
I believe that it is $\frac{i}{k+1}$ and I will prove it by induction. Recall $\mathbb{E}[X_i'] = \int_0^1 \mathbb{P}(X_i' \geq t) dt$.
For the minimum, we have $\mathbb{P}(X_1' \geq t) = (1-t)^k$, since the smallest number being at least $t$ is the same as requiring all $k$ numbers to lie in $[t,1]$. Therefore $$ \mathbb{E}[X_1'] = \int_0^1 \mathbb{P}(X_1' \geq t) dt = \int_0^1 (1-t)^k dt = \frac{1}{k+1}. $$
Now assume the claim holds for $X_{i}'$ and let's prove it for $X_{i+1}'$. We will condition on the event $X_{i}' \geq t$. In other words \begin{align} \mathbb{P}(X_{i+1}' \geq t) &= \mathbb{P}(X_{i+1}' \geq t \cap X_{i}' \geq t) +\mathbb{P}(X_{i+1}' \geq t \cap X_{i}' < t) \\ &=\mathbb{P}(X_{i+1}' \geq t \:|\: X_{i}' \geq t) \mathbb{P}(X_i' \geq t) + \mathbb{P}(X_{i+1}' \geq t \cap X_{i}' < t) \\ &= 1 \cdot \mathbb{P}(X_i' \geq t) + \mathbb{P}(X_{i+1}' \geq t \cap X_{i}' < t) \end{align} where the conditional probability equals $1$ because $X_{i+1}' \geq X_i'$ always holds.
By the induction hypothesis, the first term integrates to $\frac{i}{k+1}$, while the latter probability is simply the probability of having sampled exactly $i$ elements in $[0,t)$ and $k-i$ elements in $[t,1]$, which is $$ \mathbb{P}(X_{i+1}' \geq t \cap X_i' < t) = {k \choose i} t^i (1-t)^{k-i}. $$ Therefore the expected value is \begin{align} \mathbb{E}[X_{i+1}'] &= \int_0^1 \mathbb{P}(X_{i+1}' \geq t)dt \\ &= \int_0^1 \mathbb{P}(X_{i}' \geq t)dt + \int_0^1{k \choose i} t^i (1-t)^{k-i}dt \\ &=\frac{i}{k+1} + \frac{k!}{i!(k-i)!}\frac{i!(k-i)!}{(k+1)!}\\ &= \frac{i+1}{k+1} \end{align} The second integral is the beta integral $\int_0^1 t^a(1-t)^b\,dt=\frac{a!\,b!}{(a+b+1)!}$, which can be easily proven by induction on $b$.
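This closed form is easy to sanity-check numerically. Here is a quick Monte Carlo sketch (the function name and parameter choices are mine, purely for illustration):

```python
import random

# Monte Carlo check: with k i.i.d. Uniform(0,1) draws, the i-th smallest
# value should have expectation i / (k + 1).
def order_stat_means(k, trials=200_000, seed=0):
    rng = random.Random(seed)
    sums = [0.0] * k
    for _ in range(trials):
        xs = sorted(rng.random() for _ in range(k))  # ascending order statistics
        for i, x in enumerate(xs):
            sums[i] += x
    return [s / trials for s in sums]

means = order_stat_means(5)
# For k = 5 each estimate should come out close to i / 6, i = 1, ..., 5.
```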
Now consider the following question:
Assume you sample $k$ values from the same distribution until $X_1 > X_2 > \ldots > X_k$. What is $\mathbb{E}[X_i]$, or in other words $\mathbb{E}[X_i | X_1 > X_2 >\ldots > X_k]$?
The only difference here is that the $X_i$ are ordered from the largest to the smallest value, but I would say that the same arguments as above stand and $\mathbb{E}[X_i | X_1 > X_2 >\ldots > X_k] = 1-\frac{i}{k+1}$ (because $X_i \leftrightarrow X_{k+1-i}'$).
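A rejection-sampling check appears to support this (illustrative code, my own naming): draw $k$ values, keep only the draws that happen to come out in strictly decreasing order, and average each position.

```python
import random

# Rejection sampling: keep only runs of k uniforms that are strictly
# decreasing, then average the i-th value over the accepted runs.
def decreasing_run_means(k, accepted=50_000, seed=1):
    rng = random.Random(seed)
    sums = [0.0] * k
    n = 0
    while n < accepted:
        xs = [rng.random() for _ in range(k)]
        if all(xs[j] > xs[j + 1] for j in range(k - 1)):
            for i, x in enumerate(xs):
                sums[i] += x
            n += 1
    return [s / accepted for s in sums]

means = decreasing_run_means(3)
# For k = 3 the estimates should be close to 3/4, 2/4, 1/4.
```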
Is this correct?
My second and the main question is the following.
Assume we sample $k+1$ elements until $X_1 > X_2 > \cdots > X_k$ and $X_{k+1} > X_k$.
What is $\mathbb{E}[X_i | X_1 > X_2 > \cdots > X_k \cap X_{k+1} > X_k]$?
In other words, we sample until the first $k$ elements are in decreasing order but the $(k+1)$-st one breaks the pattern.
Then how do I compute $\mathbb{E}[X_i]$?
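Lacking a closed form, one can at least estimate these conditional expectations by rejection sampling (a sketch with made-up names; a small $k$ keeps the acceptance rate reasonable):

```python
import random

# Rejection sampling for the main question: keep draws of k + 1 uniforms
# where X_1 > X_2 > ... > X_k but X_{k+1} > X_k, and average each position.
def conditional_means(k, accepted=50_000, seed=2):
    rng = random.Random(seed)
    sums = [0.0] * (k + 1)
    n = 0
    while n < accepted:
        xs = [rng.random() for _ in range(k + 1)]
        if all(xs[j] > xs[j + 1] for j in range(k - 1)) and xs[k] > xs[k - 1]:
            sums = [s + x for s, x in zip(sums, xs)]
            n += 1
    return [s / accepted for s in sums]

means = conditional_means(3)
# means[i] estimates E[X_{i+1} | X_1 > X_2 > X_3, X_4 > X_3].
```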
EDIT: The motivation for this question comes from About problem A4 2022 of Putnam
If $X_1, X_2, \ldots, X_k, X_{k+1}$ is an i.i.d. random sample, then each of the $(k+1)!$ orderings is equally likely.
Given $X_1 > X_2 > \ldots > X_k$, for a new i.i.d. random sample $X_{k+1}$, it is equally-likely for $X_{k+1}$ to be inserted in one of the following $(k+1)$ intervals: $$\{(-\infty, X_k), (X_k, X_{k-1}), \ldots, (X_2, X_1), (X_1, +\infty)\}$$
assuming the underlying distribution is continuous, so that ties can be neglected. Hence the rank of the new observation $X_{k+1}$, after inserting it into the original sample, is uniformly distributed on $\{1, 2, \ldots, k+1\}$.
Now if we are further given $X_{k+1} > X_k$, then the first interval is excluded and the remaining $k$ choices of insertion are still equally likely: $i$ of the intervals make the new sample greater than $X_i$, and $k-i$ of them make it less than $X_i$.
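This uniformity of the insertion rank can be checked empirically (illustrative sketch, names are mine):

```python
import random
from collections import Counter

# Draw k uniforms plus one extra, condition on the extra draw exceeding the
# minimum, and record its descending rank among all k + 1 values. The counts
# over ranks 1..k should be roughly uniform.
def insertion_rank_counts(k, accepted=60_000, seed=3):
    rng = random.Random(seed)
    counts = Counter()
    n = 0
    while n < accepted:
        xs = sorted((rng.random() for _ in range(k)), reverse=True)  # descending
        new = rng.random()
        if new > xs[-1]:                                 # condition: X_{k+1} > X_k
            rank = sum(1 for x in xs if x > new) + 1     # descending rank
            counts[rank] += 1
            n += 1
    return counts

counts = insertion_rank_counts(4)
# Each rank 1..k should receive about accepted / k of the mass.
```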
To avoid confusion, we keep the descending ordering but add the sample size as a second subscript, to distinguish the original sample from the enlarged sample after insertion. Given the original sample,
$$ X_{1:k} > X_{2:k} > \ldots > X_{k:k} $$
There is probability $i/k$ that the new sample is greater than $X_{i:k}$, and probability $1-i/k$ that it is less than $X_{i:k}$ (but still greater than $X_{k:k}$). So we have
$$ X_{i:k} = \begin{cases} X_{i+1:k+1} & \text{if } X_{k+1} > X_{i:k} & \text{with probability } \displaystyle \frac {i} {k}\\ X_{i:k+1} & \text{if } X_{k+1} < X_{i:k} & \text{with probability } \displaystyle 1 - \frac {i} {k}\end{cases}$$
Therefore, by the law of total expectation,
$$ E[X_{i:k}|X_{1:k} > X_{2:k} > \ldots > X_{k:k} \cap X_{k+1} > X_{k:k}] = E[X_{i+1:k+1}]\frac {i} {k} + E[X_{i:k+1}]\left(1 - \frac {i} {k}\right)$$
In particular, when the sample is from $\text{Uniform}(0,1)$, the $j$-th largest of $k+1$ i.i.d. draws has expectation $E[X_{j:k+1}] = 1 - \frac{j}{k+2}$ (note the sample size after insertion is $k+1$, not $k$), so the above becomes $$ \left(1 - \frac {i+1} {k+2}\right)\frac {i} {k} + \left(1 - \frac {i} {k+2}\right)\left(1 - \frac {i} {k}\right) = \frac {(k+1-i)i + (k+2-i)(k-i)} {k(k+2)} = \frac{k(k+2) - i(k+1)}{k(k+2)} = 1 - \frac {i(k+1)} {k(k+2)}$$
For example, $k = 3$, $i = 1$ gives $\frac{11}{15}$, slightly below the value $1 - \frac{i}{k+1} = \frac{3}{4}$ obtained without the condition $X_{k+1} > X_k$. Since $\frac{i(k+1)}{k(k+2)} > \frac{i}{k+1}$, conditioning on the insertion pulls each expectation down a little.
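The recursion itself can be verified numerically, comparing a rejection-sampling estimate of the conditional mean against the right-hand side evaluated with the $(k+1)$-sample order-statistic expectations (sketch code, my own naming):

```python
import random

# Estimate E[X_i | X_1 > ... > X_k, X_{k+1} > X_k] by rejection sampling.
def conditional_mean(k, i, accepted=50_000, seed=4):
    rng = random.Random(seed)
    total, n = 0.0, 0
    while n < accepted:
        xs = [rng.random() for _ in range(k + 1)]
        if all(xs[j] > xs[j + 1] for j in range(k - 1)) and xs[k] > xs[k - 1]:
            total += xs[i - 1]
            n += 1
    return total / accepted

def descending_mean(j, n):
    # j-th largest of n i.i.d. Uniform(0,1): expectation (n + 1 - j) / (n + 1)
    return (n + 1 - j) / (n + 1)

k, i = 3, 1
lhs = conditional_mean(k, i)
rhs = (i / k) * descending_mean(i + 1, k + 1) + (1 - i / k) * descending_mean(i, k + 1)
# lhs and rhs should agree up to Monte Carlo error.
```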