We have $n$ objects indexed $1$ to $n$. We draw objects at random, one by one, without replacement. For every drawn object we win $\$1$; the game ends when a drawn object has a smaller index than the previously drawn one (e.g., if the object labeled 3 is drawn right after the object labeled 5, the game ends).
Let $ X_k= \begin{cases} 1, & \text{if at least $k$\$ are won} \\ 0, & \text{otherwise} \end{cases} $
Q: What is the PMF of $X_k$, and what is the expected amount of $\$$ at the end?
$X_k$ didn't seem easy to calculate directly, so I first considered another event.
Let $ Y_k= \begin{cases} 1, & \text{if exactly $k$\$ are won} \\ 0, & \text{otherwise} \end{cases} $
I worked out a small case, $k=2$, $n=4$, to see the pattern:

$P(Y_2=1)=\large\frac{{4 \choose 2}}{\frac{4!}{(4-2)!}}$, which I generalized to $P(Y_k=1)=\large\frac{{n \choose k}}{\frac{n!}{(n-k)!}}=\frac{1}{k!}$.

The $\large{4 \choose 2}$ comes from listing all possible combinations $obj_i\,obj_j$ with $i<j$.
Given that the above is correct, $X_k=\sum^n_{i=k}{Y_i}$, but when I calculate the expected win from this I get $e$, which is most likely wrong.
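A quick Monte Carlo sanity check of the expected win (a sketch; `play` is my own helper, and I'm assuming the winnings equal the length of the initial increasing run, i.e. the run-breaking draw does not pay):

```python
import math
import random

def play(n, rng):
    """Simulate one game: winnings = length of the initial increasing run."""
    perm = rng.sample(range(n), n)  # random draw order without replacement
    win = 1  # the first draw always pays $1
    while win < n and perm[win] > perm[win - 1]:
        win += 1
    return win

rng = random.Random(12345)
n, trials = 10, 200_000
avg = sum(play(n, rng) for _ in range(trials)) / trials
print(avg)  # about 1.72 for n = 10, noticeably below e ≈ 2.718
```

The simulated mean sits near $1.72$, not near $e\approx 2.718$, which supports my suspicion that the expectation I derived is wrong.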
When you draw $k$ objects, you win $k$ dollars exactly when they come out in increasing order, which is one of the $k!$ equally likely orders, so $P(X_k=1)=\frac 1{k!}$.

For the exact winnings, aside from the last case, $P(Y_k=1)=\frac 1{k!}-\frac 1{(k+1)!}=\frac k{(k+1)!}$. The special case is $P(Y_n=1)=\frac 1{n!}$, because after drawing all $n$ objects you stop without needing a decrease. The expected win is then $$\sum_{i=1}^{n-1}\frac{i^2}{(i+1)!}+\frac n{n!}$$ Alpha gives me a mess involving incomplete Gamma functions for the partial sum, but the limit for large $n$ is $e-1$.
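As a cross-check (a sketch; the function names are mine), note that by linearity of expectation the winnings also satisfy $E[\text{win}]=\sum_{k=1}^n P(X_k=1)=\sum_{k=1}^n \frac1{k!}$, and this agrees with the sum above for every $n$:

```python
import math

def expected_win_exact(n):
    # the sum from the answer: sum_{i=1}^{n-1} i^2/(i+1)!  +  n/n!
    return sum(i * i / math.factorial(i + 1) for i in range(1, n)) + n / math.factorial(n)

def expected_win_indicators(n):
    # linearity of expectation over the X_k: E[win] = sum_{k=1}^n 1/k!
    return sum(1 / math.factorial(k) for k in range(1, n + 1))

for n in (2, 5, 10, 20):
    assert math.isclose(expected_win_exact(n), expected_win_indicators(n))

print(expected_win_exact(10))  # ≈ 1.7182818, already e - 1 to 7 decimal places
```

The $\frac1{k!}$ terms shrink so fast that the expectation is indistinguishable from $e-1$ even for modest $n$.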