Once in a while I hear people say something like
"X is twice as likely as Y."
What they usually mean is:
$$p(X) = 2 \cdot p(Y)$$
and, in the context they refer to, they usually have $p(Y) < \frac{1}{2}$. But what do you do if $p(Y) > \frac{1}{2}$? Can there even be an event $X$ that is twice as likely as $Y$? It also feels wrong to me to say that $p(X) = 100 \%$ is twice as likely as $p(Y) = 50\%$.
Is there a good definition of what "twice as likely" means?
Some thoughts about this
Let's model "twice as likely" as a function
$$d: D \rightarrow [0, 1]$$
I would expect $d$ to have the following properties:
- $D = [0, m]\subseteq [0,1]$
- $d(0) = 0$
- $d$ is monotonically increasing
There's one interpretation which I think makes the most sense: convert the probability to odds, double the odds, and convert back: $$p\longrightarrow p:(1-p)\longrightarrow 2p:(1-p)\longrightarrow \frac{2p}{1+p}.$$ One could also interpret it as taking the better of two independent attempts, in which case you get $$p+(1-p)p=2p-p^2.$$ Both are ${\sim}2p$ for small values of $p$.
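A quick numerical sketch of the two interpretations (the function names `double_odds` and `best_of_two` are mine, not standard):

```python
def double_odds(p):
    # "Twice as likely" via odds: p : (1 - p) becomes 2p : (1 - p),
    # which corresponds to the probability 2p / (1 + p).
    return 2 * p / (1 + p)

def best_of_two(p):
    # "Twice as likely" as the better of two independent attempts:
    # the probability that at least one of two trials succeeds.
    return 1 - (1 - p) ** 2  # = 2p - p^2

# Both stay inside [0, 1] even when p > 1/2:
print(double_odds(0.5))    # 2/3 ≈ 0.6667
print(best_of_two(0.5))    # 0.75

# For small p, both are approximately 2p:
print(double_odds(0.001))  # ≈ 0.001998
print(best_of_two(0.001))  # ≈ 0.001999
```

Note that neither definition ever reaches $1$ for $p < 1$, which resolves the discomfort with "$100\%$ is twice as likely as $50\%$".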
I think the first is the right one, because then "half as likely" coincides with "twice as unlikely" (doubling the odds of the complementary event). If you generalize the second interpretation to real factors $k$ via $1-(1-p)^k$, this symmetry breaks: taking the worse of two attempts is no longer the same as being half as likely.
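The symmetry claim can be checked numerically; `times_odds` and `times_best` below are my own names for the two interpretations generalized to a real factor $k$:

```python
def times_odds(p, k):
    # "k times as likely" via odds scaling: p : (1 - p) becomes kp : (1 - p).
    return k * p / (1 - p + k * p)

def times_best(p, k):
    # "k times as likely" by generalizing best-of-n attempts: 1 - (1 - p)^k.
    return 1 - (1 - p) ** k

p = 0.5

# Odds scaling: "half as likely" equals "twice as unlikely"
# (the identity times_odds(p, 1/2) = 1 - times_odds(1 - p, 2) holds for all p).
print(times_odds(p, 0.5))          # 1/3
print(1 - times_odds(1 - p, 2))    # 1/3

# Best-of-n generalization: the two notions disagree.
print(times_best(p, 0.5))          # 1 - sqrt(0.5) ≈ 0.2929
print(1 - times_best(1 - p, 2))    # 0.25
```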