I have this problem on which I would appreciate some help.
"Consider the quadratic equation $x^2 + Bx + C = 0$ where $B$ and $C$ are independent and are uniformly distributed on $[-n,n]$. Find the probability that the equation has real roots. What happens as $n > \longrightarrow \infty $?"
So, my attempt goes like this.
First note that since they are uniformly distributed on $[-n,n]$, we know that $f_B(x) = \frac{1}{2n}$ and $F_C(x) = \frac{x+n}{2n}$ for $x \in [-n,n]$.
The equation has real roots exactly when the discriminant $B^2 - 4C$ is nonnegative, so we are looking for $P(C\leq(B/2)^2)$.
Now, if we condition on $B=x$ we can use the law of total probability
$P(C\leq(B/2)^2)=\displaystyle \int_{-\infty}^{\infty} P(C\leq(B/2)^2 \mid B=x)f_B(x)\,dx=\displaystyle \int_{-\infty}^{\infty} P(C\leq(x/2)^2 \mid B=x)f_B(x)\,dx$
which is equal to
$\displaystyle \int_{-\infty}^{\infty} P(C\leq(x/2)^2)f_B(x)\,dx = \int_{-\infty}^{\infty} F_C((x/2)^2)f_B(x)\,dx$
because of their independence.
Thus,
$P(C\leq(B/2)^2) = \frac{1}{4n^2}\displaystyle \int_{-\infty}^{\infty} \left(\left(\frac{x}{2}\right)^2 + n\right) dx$
and because $f_B(x) = 0$ outside $[-n,n]$ we get
$P(C\leq(B/2)^2) = \frac{1}{4n^2}\displaystyle \int_{-n}^{n} \left(\left(\frac{x}{2}\right)^2 + n\right) dx$
which evaluates to
$\frac{n}{24} + \frac{1}{2}$.
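For completeness, the evaluation is just the power rule:

$$\frac{1}{4n^2}\int_{-n}^{n}\left(\frac{x^2}{4}+n\right)dx = \frac{1}{4n^2}\left[\frac{x^3}{12}+nx\right]_{-n}^{n} = \frac{1}{4n^2}\left(\frac{n^3}{6}+2n^2\right) = \frac{n}{24}+\frac{1}{2}.$$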
However, my textbook says that the right answer is $\frac{n}{24} + \frac{1}{2}$ for $n\leq 4$ and $1-\frac{2}{3\sqrt{n}}$ for $n \geq 4$. What am I doing wrong in my solution above? Clearly I must be doing something right, because I get part of the right answer.
Any ideas? Thanks!
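Edit: a quick Monte Carlo sanity check (a sketch; the helper name `p_real_roots_mc` is mine) shows that my formula agrees with simulation for $n=2$ but overshoots for $n=10$:

```python
import random

def p_real_roots_mc(n, samples=200_000, seed=0):
    """Estimate P(x^2 + Bx + C = 0 has real roots) for
    B, C independent Uniform[-n, n] by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        b = rng.uniform(-n, n)
        c = rng.uniform(-n, n)
        # Real roots iff the discriminant b^2 - 4c is nonnegative,
        # i.e. c <= (b/2)^2.
        if c <= (b / 2) ** 2:
            hits += 1
    return hits / samples

for n in (2, 10):
    print(f"n={n}: simulated={p_real_roots_mc(n):.4f}, "
          f"my formula={n / 24 + 0.5:.4f}")
```

For $n=2$ the two numbers agree, but for $n=10$ the simulation gives a noticeably smaller probability than $\frac{10}{24}+\frac12$, so the formula cannot hold for all $n$.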
Be aware that: $$F_C(x)=\frac{x+n}{2n}\tag1$$ is true for $x\in[-n,n]$ but not for values outside this interval.
If $0<n\leq 4$, then $|b|\leq n$ implies $0\leq\frac14b^2\leq \frac{n^2}{4}\leq n$, so you can use $(1)$ without running into trouble.
This is no longer the case when $n>4$. If, e.g., $n=10$, then $\frac14b^2$ takes values in $[0,25]$, and $(1)$ is not valid for $10<x\leq25$; there you must use $F_C(x)=1$ instead.
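With that correction, the $n>4$ case works out as follows (a sketch): for $|b|\geq 2\sqrt n$ we have $\frac14b^2\geq n$, hence $F_C\left(\frac14b^2\right)=1$, and

$$\begin{aligned}
P\left(C\leq \tfrac14 B^2\right)
&= \frac{1}{2n}\int_{-2\sqrt n}^{2\sqrt n}\frac{\frac14 b^2+n}{2n}\,db
 + \frac{1}{2n}\int_{2\sqrt n\leq |b|\leq n} 1\,db\\
&= \frac{1}{4n^2}\left[\frac{b^3}{12}+nb\right]_{-2\sqrt n}^{2\sqrt n}
 + \frac{2\left(n-2\sqrt n\right)}{2n}\\
&= \frac{1}{4n^2}\cdot\frac{16\,n^{3/2}}{3} + 1 - \frac{2}{\sqrt n}
 = \frac{4}{3\sqrt n} + 1 - \frac{2}{\sqrt n}
 = 1-\frac{2}{3\sqrt n},
\end{aligned}$$

which is exactly the textbook's second formula.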