In a certain game, you may split your money between two piles, labelled A and B. A fair coin is tossed. If it lands heads, then all the money in pile A is multiplied by a factor $\alpha$ and returned to you. Likewise, if it lands tails, then all the money in pile B is multiplied by a factor $\beta$ and returned to you. Determine an optimal allocation of money into the two piles if i) $\alpha=2$ and $\beta=0.5$, or ii) $\alpha=2$ and $\beta=1.5$.
My thought process is: suppose we place a fraction $f\in[0,1]$ in pile A and $(1-f)$ in pile B. Then twice the expected return (so that we can ignore the factor of one half) is $f\alpha+(1-f)\beta=(\alpha-\beta)f+\beta$, which is linear in $f$. My gut tells me I should not put all my money in the same pile, but the monotonicity implies otherwise. Where have I gone wrong?
As long as you have some money left, you can play the game again. Wealth then compounds multiplicatively over repeated rounds, which changes the relevant objective from the (linear) single-round expectation to the exponential growth rate of the repeated game.
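To spell that out (assuming, as the question's expectation formula does, that the pile the coin does not select is forfeited): after $n$ rounds with $H$ heads and $T$ tails, your wealth is $W_n = W_0\,(\alpha f)^{H}\big(\beta(1-f)\big)^{T}$, so by the law of large numbers the per-round log growth rate is $g(f)=\tfrac12\log(\alpha f)+\tfrac12\log\big(\beta(1-f)\big)$, which is maximized at $f=\tfrac12$ for any $\alpha,\beta>0$. In particular $f\in\{0,1\}$ gives $g=-\infty$: one bad toss ruins you, which is exactly what the linear expectation misses. A quick numerical sketch under this forfeiture reading of the rules:

```python
import math

def growth_rate(f, alpha, beta):
    """Expected log growth per round when a fraction f goes to pile A
    and the pile not selected by the fair coin is forfeited."""
    return 0.5 * math.log(alpha * f) + 0.5 * math.log(beta * (1 - f))

def best_fraction(alpha, beta, steps=10_000):
    """Grid-search for the allocation maximizing the log growth rate."""
    grid = [i / steps for i in range(1, steps)]  # avoid f = 0, 1 (certain ruin)
    return max(grid, key=lambda f: growth_rate(f, alpha, beta))

for alpha, beta in [(2.0, 0.5), (2.0, 1.5)]:
    f_star = best_fraction(alpha, beta)
    print(f"alpha={alpha}, beta={beta}: f* = {f_star:.3f}, "
          f"g(f*) = {growth_rate(f_star, alpha, beta):.4f}")
```

In both cases the growth-maximizing split is $f=\tfrac12$, even though the single-round expectation is maximized at $f=1$. Note that $g(\tfrac12)=\tfrac12\log(\alpha\beta/4)$, so under this model both parameter sets are long-run losing and the even split merely minimizes the rate of decay.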