Given a simple lottery game like
- Guess the right (randomly generated) number $\in [0,1000]$.
- Stake = 1€
- Win= 2001€
the expected outcome is $\frac{1}{1001}\cdot2001 + \frac{1000}{1001}\cdot(-1) = 1$. Hence, in the long run, you win 1€ per play on average. So far everything is clear. But what happens if I am only allowed to play the game once? The most probable outcome is then not winning.
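The expected value above can be checked with exact rational arithmetic (a quick sketch using Python's standard `fractions` module; the payouts and probabilities are the ones from the game description):

```python
from fractions import Fraction

# One play: win 2001€ with probability 1/1001, otherwise lose the 1€ stake.
p_win = Fraction(1, 1001)
expected = p_win * 2001 + (1 - p_win) * (-1)
print(expected)  # exact expected payout per play: 1
```

Using `Fraction` avoids any floating-point rounding, so the result is exactly 1.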
My main question is:
Is there a mathematical concept under which the expected outcome of a single play would be $-1$ (since that is the most probable outcome)?
Going further, suppose there is a second option: not playing the game and instead always receiving 0.80€.
Is there a concept that favors the safe 0.80€ choice over playing the game?
For your first part:
The mathematical concept you are looking for is the "most probable outcome", also known as the mode of the distribution. The mode here is $-1$. Why would we need another term to describe the most probable outcome? The term we have is perfectly fine: it's descriptive, not too long, and very understandable.
For the second part:
The concept that favors the safe choice appears once you also look at the variance of the random variable. The payout $X$ of playing the game has an expected value of $1$ and a variance of $$E(X^2)-E(X)^2 = \frac{1}{1001}\cdot 2001^2 + \frac{1000}{1001}\cdot(-1)^2 - 1^2 = 4000,$$
which is... well, a lot. The "get $0.8$ for sure" option has a variance of $0$, so it's a much safer bet. Note, however, that if you are allowed to play the first game many, many times, the variance of your *average* payout per play shrinks: over $n$ independent plays it is $\mathrm{Var}(X)/n$, so the risk per play vanishes as $n$ grows.
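The variance computation and the shrinking variance of the average can be verified in a few lines (again a sketch with exact `Fraction` arithmetic; the values come straight from the game above):

```python
from fractions import Fraction

p = Fraction(1, 1001)
mean = p * 2001 + (1 - p) * (-1)               # E(X) = 1
second_moment = p * 2001**2 + (1 - p) * (-1)**2  # E(X^2)
var = second_moment - mean**2                  # Var(X) = 4000
print(var)

# For n independent plays, the variance of the average payout is Var(X)/n.
for n in (1, 100, 10000):
    print(n, var / n)
```

So after 10000 plays the average payout has variance $0.4$, even though each individual play is very risky.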