Let's say I run a random generator that gives me a number of apples based on a roll in the range from 0 to 100, using the percentages below.
50% for 1, 40% for 2, 7% for 3, 2% for 4 and only 1% to get 5
My question is: what is the overall average for the above? For example, if I had 50% for 1 and 50% for 2, the average would be 1.5, since about half the time the generator would give 2, and (1 + 2)/2 = 1.5.
I guess I could use the same method as above, but expand it a little. Still, I am wondering if this sort of calculation has a particular name.
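One way to sanity-check whatever formula you use is to simulate the generator and look at the empirical average. This is a hypothetical sketch (the function name `draw_apples` and the use of a uniform roll are my assumptions, not the original generator):

```python
import random

def draw_apples(rng=random):
    """Hypothetical sketch of the described generator: 50% -> 1,
    40% -> 2, 7% -> 3, 2% -> 4, 1% -> 5, via one uniform roll."""
    roll = rng.random()  # uniform in [0, 1)
    if roll < 0.50:
        return 1
    elif roll < 0.90:   # 0.50 + 0.40
        return 2
    elif roll < 0.97:   # + 0.07
        return 3
    elif roll < 0.99:   # + 0.02
        return 4
    else:               # remaining 0.01
        return 5

# Average over many draws approximates the quantity asked about.
samples = [draw_apples() for _ in range(100_000)]
print(sum(samples) / len(samples))
```

With enough draws, the printed value settles near the exact answer computed from the percentages.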
You have pretty much answered your own question. It's
$$ 0.5 \times 1 + 0.4 \times 2 +0.07 \times 3 +0.02 \times 4 +0.01 \times 5. $$
It's called a weighted average (in probability terms, the expected value of the distribution): https://en.wikipedia.org/wiki/Weighted_arithmetic_mean
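The sum above is short enough to evaluate by hand, but here is a minimal sketch of the same computation, with the percentages stored as probabilities keyed by outcome (the dictionary name `probs` is my own):

```python
# Probability of each apple count, taken from the question.
probs = {1: 0.50, 2: 0.40, 3: 0.07, 4: 0.02, 5: 0.01}

# Weighted average: multiply each outcome by its probability and sum.
average = sum(count * p for count, p in probs.items())
print(round(average, 2))  # 1.64
```

So on average the generator yields about 1.64 apples per draw.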