I have a question where first I need to understand what is happening, but then I also need to code it in a program called APPL, an extension of Maple 18 that I have never used, yet I have been forced to do so. If anyone knows of a site that helps with APPL, that would be great — all I have found is Apple stuff.
Find the mean of the difference between the two real roots of the quadratic $x^2+Bx+C=0$, where $B\sim U(2,3)$ and $C\sim U(0,1)$ are independent random variables.
The probability mass function of $X$ is given by $f_X(x)=\frac{x}{10}$ for $x=1,2,3,4$. Three observations are drawn at random, without replacement, from this distribution. Find the probability that the second-largest observation equals 2, using APPL.
I have some idea of how to do number 2 with APPL, but I would really just like to know what is going on in number 1.
By the quadratic formula, the roots of the equation $x^2+Bx+C=0$ are $$ \frac{-B\pm\sqrt{B^2-4C}}{2}. $$ If $B>2$ and $0<C<1$, then $B^2>4>4C$, so the discriminant is positive and both roots are real. You can check that the difference between the roots is $\Delta:=\sqrt{B^2-4C}$.
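To spell out that last check, subtracting the smaller root from the larger one makes the $-B$ terms cancel:

$$ \frac{-B+\sqrt{B^2-4C}}{2} \;-\; \frac{-B-\sqrt{B^2-4C}}{2} \;=\; \frac{2\sqrt{B^2-4C}}{2} \;=\; \sqrt{B^2-4C} \;=\; \Delta. $$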
If I understand correctly, your job is to repeatedly generate independent copies of $B$ and $C$ and then calculate $\Delta$ from them. After doing this many times, you compute the sample mean of the calculated $\Delta$'s.
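Since I can't speak to the APPL syntax, here is a minimal sketch of that simulation in plain Python instead, assuming the exercise permits a Monte Carlo estimate; the sample size `n` and the seed are arbitrary choices, and translating the same loop into Maple/APPL should be straightforward.

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

n = 200_000          # number of simulated (B, C) pairs (arbitrary choice)
total = 0.0
for _ in range(n):
    b = random.uniform(2, 3)   # B ~ U(2, 3)
    c = random.uniform(0, 1)   # C ~ U(0, 1)
    # Delta = difference of the two real roots of x^2 + Bx + C = 0
    total += math.sqrt(b * b - 4 * c)

mean_delta = total / n
print(mean_delta)  # sample mean of Delta, roughly 2.02
```

A quick sanity check on the output: integrating $\sqrt{B^2-4C}$ over the two uniform densities gives a true mean of about $2.02$, so the printed estimate should land close to that.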