Moderators: I really don't know what to name this or what tags to put so feel free to edit this
I'm a software developer, and in my spare time I want to write a program that plays a card game. It should simulate future turns to find the best moves, which is where this problem comes from.
I have a known multiset that can contain any quantity of numbers of any value. If I randomly draw 5 of them without replacement and add them together, what is their expected sum?
Now for the hard part. Say I repeat this three times, with two twists: any $-2$ that is drawn is removed from the set after being added, and any $6$ that is drawn adds a $0$ to the set. All other drawn numbers are returned to the set after being added. Given a starting set, what is the expected sum of each of the three draws?
The issue is that every number removed from or added to the set changes the distribution of all future draws:
                          start
               /            |             \
       number removed  nothing changes  number added
         /  |  \          /  |  \          /  |  \
       add rem none     add rem none     add rem none
       ..............................................
Each of those possibilities branches off again every time I draw, until I get a statistical mess I don't know how to handle.
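For what it's worth, one mechanical way I can see to attack the branching is to recurse over the current set: enumerate every equally likely 5-number hand, apply the rules, recurse on the resulting set, and average, memoizing repeated sets. A Python sketch under my reading of the rules (drawn $-2$'s are discarded, each drawn $6$ is returned along with a new $0$, everything else is returned; all names are my own):

```python
from fractions import Fraction
from functools import lru_cache
from itertools import combinations

def expected_sums(deck, n=5, draws=3):
    """Exact expected sum of each of `draws` draws of `n` numbers."""
    @lru_cache(maxsize=None)
    def ev(pool, remaining):
        # pool: sorted tuple of the numbers currently in the set.
        # Returns a tuple with the expected sum of each remaining draw.
        if remaining == 0:
            return ()
        hands = list(combinations(pool, n))  # equally likely hands
        totals = [Fraction(0)] * remaining
        for hand in hands:
            nxt = list(pool)
            for x in hand:
                nxt.remove(x)          # take the drawn numbers out
            for x in hand:
                if x == 6:
                    nxt += [6, 0]      # a 6 is returned and adds a 0
                elif x != -2:
                    nxt.append(x)      # everything except -2 is returned
            totals[0] += sum(hand)
            for i, v in enumerate(ev(tuple(sorted(nxt)), remaining - 1)):
                totals[i + 1] += v
        return tuple(t / len(hands) for t in totals)

    return ev(tuple(sorted(deck)), draws)

deck = (-2, -2, 1, 1, 1, 1, 1, 3, 3, 3, 3, 6, 6, 6, 6)
print([float(x) for x in expected_sums(deck)])
```

Memoization keeps this tractable here because the set after a draw depends only on how many $-2$'s and $6$'s were drawn so far, so the branches collapse into few distinct states; with many different special values this would blow up.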
As an example set: (-2 -2 1 1 1 1 1 3 3 3 3 6 6 6 6)
I know the expected sum of the first draw: the number of numbers drawn $(n = 5)$ times the sum of all the numbers in the set $(s = 37)$, divided by the count of numbers in the set $(a = 15)$. That is, $(n \cdot s)/a = (5 \cdot 37)/15 = 12.333\ldots$
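A quick brute-force sanity check of that formula in Python, averaging the sum over every possible 5-number draw from the example set:

```python
from itertools import combinations

deck = [-2, -2, 1, 1, 1, 1, 1, 3, 3, 3, 3, 6, 6, 6, 6]
n = 5  # numbers drawn

# Average the sum over all C(15, 5) equally likely draws without replacement.
draws = list(combinations(range(len(deck)), n))
brute_force = sum(sum(deck[i] for i in d) for d in draws) / len(draws)

# Closed form: (n * s) / a
formula = n * sum(deck) / len(deck)

print(brute_force, formula)
```

Both come out to $12.333\ldots$, which matches the linearity-of-expectation argument: each draw has mean $s/a$, and expectations add even though the draws are dependent.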
But after that, I don't know how to deal with the branching paths.
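When I can't untangle the branches analytically, I can at least get concrete numbers by Monte Carlo: play the three draws many times, applying the $-2$ and $6$ rules after each draw, and average the sums. A sketch under the same reading of the rules as above (the trial count is arbitrary):

```python
import random

def simulate(deck, n=5, draws=3, trials=100_000, seed=0):
    """Estimate the expected sum of each draw by repeated random play."""
    rng = random.Random(seed)
    totals = [0.0] * draws
    for _ in range(trials):
        pool = list(deck)
        for d in range(draws):
            hand = rng.sample(pool, n)   # draw 5 without replacement
            totals[d] += sum(hand)
            for x in hand:
                if x == -2:
                    pool.remove(x)       # -2 removes itself after being added
                elif x == 6:
                    pool.append(0)       # 6 stays but adds a 0 to the set
    return [t / trials for t in totals]

print(simulate([-2, -2, 1, 1, 1, 1, 1, 3, 3, 3, 3, 6, 6, 6, 6]))
```

The first entry should land near $12.33$, and the later entries give an empirical answer for the branching draws even if the exact math stays out of reach.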