Question: If I am slightly risk averse, which do I prefer? (Give a mathematical justification for your conclusion.)
[0.5, \$450; 0.5, \$400] or [0.1, \$4375; 0.9, \$0]
Okay, so I get that the risk-averse person would choose the first option ([0.5, \$450; 0.5, \$400]), but how can I show that mathematically? Simply summing the probabilities times the payoffs doesn't give the right answer, since 0.1*\$4375 = \$437.50 > \$425 = 0.5*\$450 + 0.5*\$400.
Edit: Using the following utility function:
$$U(S_{k+n}) = -263.31 + 22.09\log(n + 150{,}000)$$
where $n$ is the additional wealth you'd gain. (This is new to me, so let me know if we need more information; it's just a generic utility function found in a book.)
This results in the following expected utilities:
[-148.91593453] and [-148.915753041]
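In case it helps, here is a sketch of how I'm computing those values (the base-10 log and treating \$150,000 as baseline wealth are assumptions on my part, so the exact decimals may come out slightly different from mine):

```python
import math

def u(n, base_wealth=150_000):
    # Utility of gaining n dollars on top of a baseline wealth.
    # Base-10 log is assumed here; a different base rescales the
    # utilities but cannot change which lottery ranks higher.
    return -263.31 + 22.09 * math.log10(n + base_wealth)

# First lottery: [0.5, $450; 0.5, $400]
eu_safe = 0.5 * u(450) + 0.5 * u(400)

# Second lottery: [0.1, $4375; 0.9, $0]
eu_gamble = 0.1 * u(4375) + 0.9 * u(0)

print(eu_safe, eu_gamble)
```

Under these assumptions the second lottery still comes out with the (very slightly) higher expected utility, matching the ordering of the two values above.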
Well, summing the probabilities times the payoffs reflects indifference to risk: you are computing the expected value, without accounting for risk aversion at all (which explains why a computation of that kind suggests taking the gamble). The mathematical object that fits your problem is a concave function, called a utility function; let's denote it by $u$. We say that your utility function exhibits risk aversion if it satisfies the following:
\begin{equation} u(\sum x p_x)\geq \sum u(x)p_x \end{equation}
where the sum is taken over all possible payoffs $x$, each weighted by its probability $p_x$. The point is that there are plenty of such functions, and each determines a different behaviour: you can see from your example that the player has to be quite strongly risk averse not to take his chances, since the gamble has the higher expected value (\$437.50 against \$425).
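For instance, a quick numerical sketch (with $u(x)=\sqrt{x}$, just one convenient concave choice of mine, nothing canonical) shows both the inequality above and how a concave $u$ can reverse the expected-value ranking:

```python
import math

u = math.sqrt  # one concave utility function, chosen for illustration

# Safe lottery: [0.5, $450; 0.5, $400]
ev_safe = 0.5 * 450 + 0.5 * 400          # expected value: 425.0
eu_safe = 0.5 * u(450) + 0.5 * u(400)    # expected utility

# Gamble: [0.1, $4375; 0.9, $0]
ev_gamble = 0.1 * 4375 + 0.9 * 0         # expected value: 437.5
eu_gamble = 0.1 * u(4375) + 0.9 * u(0)   # expected utility

# The risk-aversion inequality: u(E[x]) >= E[u(x)] for concave u.
assert u(ev_safe) >= eu_safe
assert u(ev_gamble) >= eu_gamble

# The gamble wins on expected value, but the safe lottery wins on
# expected utility under this concave u.
print(ev_gamble > ev_safe, eu_safe > eu_gamble)
```

So under $\sqrt{\cdot}$ the safe lottery is preferred, even though the plain expected-value comparison still favours the gamble.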
Notice that if you let $u$ be the identity, you get equality above. This tells you that the identity (the function you were using in your example) describes risk indifference.
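As a quick check of that last point: with the identity, expected utility collapses to expected value, and you recover exactly the comparison from the question.

```python
# With u = identity, expected utility IS expected value, so the
# comparison reduces to the one in the question: the gamble "wins".
u = lambda x: x

eu_safe = 0.5 * u(450) + 0.5 * u(400)    # 425.0
eu_gamble = 0.1 * u(4375) + 0.9 * u(0)   # 437.5

# Equality in the relation above: u(sum of x*p) == sum of u(x)*p
assert u(0.5 * 450 + 0.5 * 400) == eu_safe
print(eu_safe, eu_gamble)
```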