I previously asked the following question: A Gambling Game
For a brief summary: $n > 1$ gamblers each receive an independent random integer drawn uniformly from the interval $[1, y]$. Let $S$ be the highest integer assigned and $I$ the lowest. The player assigned $S$ wins the game and earns $|S - I|$ dollars; in other words, the winner collects the range of the data set in winnings.
The crucial question: in the long run, does the expected value of $|S - I|$ depend on $n$? In other words, do we expect the winnings to increase, decrease, or stay the same as the number of players in the game increases?
EDIT: Empirically, the average of $|S - I|$ seems to tend toward $y - 1$ as $n$ grows. With $2$ players, I imagine the winnings average out around half of $y$, and this value then grows (tending toward $y - 1$) as $n$ increases.
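For reference, here is a hedged sketch of the expectation, assuming the $n$ integers are drawn independently and uniformly from $\{1, \dots, y\}$ (and ignoring how ties for $S$ are resolved, which does not affect the range). A standard order-statistics computation gives

$$E[S] = y - \sum_{k=1}^{y-1}\left(\frac{k}{y}\right)^{n}, \qquad E[I] = 1 + \sum_{k=1}^{y-1}\left(\frac{k}{y}\right)^{n},$$

so that

$$E[S - I] = (y - 1) - 2\sum_{k=1}^{y-1}\left(\frac{k}{y}\right)^{n}.$$

The sum vanishes as $n \to \infty$, which is consistent with the winnings tending toward $y - 1$. (Interestingly, for $n = 2$ and large $y$ the sum is approximately $y/3$, so the expected winnings are closer to $y/3$ than to $y/2$.)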
Now I am curious what kind of function would model the expected winnings as a function of $n$, as $n$ ranges from $2$ to $y$ (or some other larger number).
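To explore this curve numerically, here is a minimal sketch, assuming independent uniform draws on $\{1,\dots,y\}$. It estimates the expected range by Monte Carlo simulation and compares against the order-statistics closed form $E[S-I] = (y-1) - 2\sum_{k=1}^{y-1}(k/y)^n$; the function names and parameters (`simulate_mean_range`, `exact_mean_range`, `trials`) are my own illustrative choices, not anything from the original question.

```python
import random

def simulate_mean_range(n, y, trials=20000, seed=0):
    """Monte Carlo estimate of E[S - I] for n players drawing uniformly from [1, y]."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        draws = [rng.randint(1, y) for _ in range(n)]
        total += max(draws) - min(draws)
    return total / trials

def exact_mean_range(n, y):
    """Closed form E[S - I] = (y - 1) - 2 * sum_{k=1}^{y-1} (k/y)^n."""
    return (y - 1) - 2 * sum((k / y) ** n for k in range(1, y))

if __name__ == "__main__":
    y = 100
    for n in [2, 5, 10, 50]:
        # The two columns should agree up to Monte Carlo noise.
        print(n, round(simulate_mean_range(n, y), 2), round(exact_mean_range(n, y), 2))
```

Plotting `exact_mean_range(n, y)` against $n$ suggests the shape of the model: it rises steeply for small $n$ and saturates toward $y - 1$, with the gap $y - 1 - E[S-I]$ decaying roughly like a constant times $(1 - 1/y)^n$ once $n$ is large.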