A fraternity is throwing a party for its members. The cost of booking a band to play at the party, the amount the supplier will charge, the cost of renting a venue, and some other costs are uniform random variables over the intervals (1300, 1800), (1800, 2000), (800, 1200), and (400, 700), respectively. The number of people invited is a random integer in the interval (150, 200). What is the minimum amount, on average, that the fraternity will have to charge each person so as not to lose money?
What I did: I summed the means of all the costs, which gave me 5000, and took the average number of attendees, 175. Dividing them gives about $\$28.57$ per person, but I don't know how to continue.
Here is a simulation in R of total costs for the party and cost per ticket.
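A minimal sketch of such a simulation, assuming the four costs are independent and attendance is a discrete uniform integer from 150 through 200 (variable names are illustrative):

```r
set.seed(2023)                       # for reproducibility
m <- 10^6                            # number of simulated parties

# Four independent uniform costs: band, supplier, venue, other
band     <- runif(m, 1300, 1800)
supplier <- runif(m, 1800, 2000)
venue    <- runif(m,  800, 1200)
other    <- runif(m,  400,  700)
total    <- band + supplier + venue + other

# Attendance: assumed discrete uniform on the integers 150..200
n <- sample(150:200, m, replace = TRUE)

per.ticket <- total / n              # break-even price per person

mean(total);  sd(total)              # near 5000 and 212
median(per.ticket)                   # roughly 28.5
quantile(per.ticket, 0.95)           # price leaving about a 5% chance of a loss
```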
You can sum the means of the four uniform distributions to get the mean of the total cost (about $\$5000$). Also, provided the costs are independent, you can sum their variances to get the variance of the total cost; then take the square root to get the standard deviation (about $\$212$). The distribution of the total cost is not exactly normal, but it is nearly normal.
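Concretely, a Uniform$(a,b)$ variable has mean $(a+b)/2$ and variance $(b-a)^2/12$, so

$$\mu = 1550 + 1900 + 1000 + 550 = 5000, \qquad
\sigma^2 = \frac{500^2 + 200^2 + 400^2 + 300^2}{12} = 45{,}000, \qquad
\sigma = \sqrt{45{,}000} \approx 212.13.$$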
If the probability models for costs and attendance are correct, the fraternity is unlikely to lose money charging $\$33.50$ per ticket. [If the ticket price were set at $\$28.55$ (the median of the per-ticket cost distribution), there would be a 50:50 chance of losing money.]
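Continuing the simulation sketched above, the loss probability at any candidate price can be estimated directly:

```r
# Estimated probability of a loss at a given ticket price:
# the proportion of simulated parties whose break-even price exceeds it
mean(per.ticket > 33.50)             # should be small
mean(per.ticket > 28.55)             # should be close to 0.50 (the median price)
```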