Problem from Sheldon Ross's Elementary Introduction to Finance (Exercise 4.28):
For an initial investment of 100, an investment yields returns of $X_i$ at the end of period $i$ for $i = 1,2$, where $X_1$ and $X_2$ are independent normal random variables with mean $60$ and variance $25$. What is the probability that the rate of return on this investment is greater than $10$ percent?
In the book, Ross defines the rate of return as the rate $r$ such that the present value of the returns equals the initial payment.
What I have so far:
$$0 = -100 + \frac{X_1}{1+r} + \frac{X_2}{(1+r)^2}$$
$$\frac{1}{1+r} = \frac{-X_1 + \sqrt{X_1^2 + 400X_2}}{2X_2}$$ My idea was that setting $y = \frac{1}{1+r}$ turns the first equation into the quadratic $X_2 y^2 + X_1 y - 100 = 0$, whose positive root is the expression above. Taking the reciprocal and subtracting $1$ would then give an expression for $r$, and I could use the random variables' distributions to compute $P(r > 0.1)$. But it's looking pretty tough to do it this way, if it's even possible. Is this approach correct, or is there an easier way to do this?
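As a numerical sanity check (not part of the exercise itself), here is a quick Monte Carlo sketch of $P(r > 0.1)$ using the quadratic-root expression for $\frac{1}{1+r}$. The sample size and seed are arbitrary choices of mine, and I'm assuming the positive root is the relevant one (with mean $60$ and standard deviation $5$, $X_2 > 0$ essentially always, so the square root is real and the positive root is the meaningful discount factor):

```python
import numpy as np

# Arbitrary seed and sample size, just for a rough estimate
rng = np.random.default_rng(0)
n = 200_000

# X1, X2 ~ Normal(mean=60, variance=25), so sd = 5
X1 = rng.normal(60, 5, n)
X2 = rng.normal(60, 5, n)

# Positive root of X2*y^2 + X1*y - 100 = 0, where y = 1/(1+r)
y = (-X1 + np.sqrt(X1**2 + 400 * X2)) / (2 * X2)
r = 1 / y - 1

print((r > 0.1).mean())
```

Running this gives an estimate of roughly $0.75$ for $P(r > 0.1)$, which at least tells me what answer the closed-form approach should reproduce.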