Probability "average" understanding


This is more a problem of understanding probabilities than an actual question.

In a game I am playing I can use a certain item to try to unlock different levels.

The item will unlock a new level with the following probabilities:

Level 1: 32.68%
Level 2: 29.41%
Level 3: 26.14%
Level 4: 9.80%
Level 5: 1.63%
Level 6: 0.33%

A user has suggested that these probabilities can be interpreted as follows:

300 items on average to get a level 6.
16.4% chance of getting a level 6 using 50 items.
28.4% chance of getting a level 6 using 100 items.
63.4% chance of getting a level 6 using 300 items.
96.5% chance of getting a level 6 using 1000 items.

Now I understand how to calculate the probability of getting a level 6 within X tries:

  1. Calculate the chance to NOT get a level 6 with a single item; in this case it is 0.967.
  2. Multiply this number by itself X times.
  3. Subtract the result of the previous step from 1.
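The three steps above can be sketched in Python (the per-item level 6 chance below uses the corrected value $0.0033$ from the table; the function name is just for illustration):

```python
def p_at_least_one(p_success: float, x: int) -> float:
    """Probability of at least one success in x independent tries."""
    p_fail = 1 - p_success       # step 1: chance a single item fails
    p_all_fail = p_fail ** x     # step 2: chance all x items fail
    return 1 - p_all_fail        # step 3: chance of at least one success

p6 = 0.0033  # chance of a level 6 unlock per item

for tries in (50, 100, 300, 1000):
    print(tries, round(p_at_least_one(p6, tries), 3))
```

With $p = 0.0033$ these come out close to, but not exactly, the user's suggested figures, which were probably computed from a slightly different per-item probability.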

What I fail to understand is how he calculated the "average" number of items needed to get a level 6, and I hope you can help me.



Best answer:

Your calculation of the chance not to get a level 6 item is off: you slipped a decimal. On one try it is $0.9967$, not $0.967$. The number of tries until the first success follows a geometric distribution, and for a geometric distribution the average number of tries to success is the reciprocal of the chance of success on a single try. Since $\frac 1{0.0033} \approx 300$ (it is really $303.\overline {03}$), the expected number of tries is about $300$.
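As a quick numeric check of the geometric-distribution claim, a short simulation (a sketch; the seed is fixed only for reproducibility):

```python
import random

p = 0.0033            # chance of a level 6 unlock per item
expected = 1 / p      # mean of a geometric distribution: 1/p
print(expected)       # 303.0303...

# Monte Carlo check: average number of items used until the first level 6
random.seed(1)
trials = 100_000
total = 0
for _ in range(trials):
    items = 1
    while random.random() >= p:   # each draw < p counts as a level 6
        items += 1
    total += items
print(total / trials)  # should land near 303
```

The simulated mean drifts around the true value $303.\overline{03}$ by a few items, since the geometric distribution has a large standard deviation (about $303$ as well).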