Let $X_1$ and $X_2$ have a bivariate normal distribution with parameters $\mu_1 = \mu_2 = 0$, $\sigma_1 = \sigma_2 = 1$, and $\rho = 1/2$. Find the probability that all the roots of $X_1x^2 + 2X_2x + X_1 = 0$ are real.
Hint: First establish that $X_1 + X_2$ and $X_1 - X_2$ are bivariate normally distributed and that they are mutually independent.
Attempt:
For the roots to be real, the discriminant must be nonnegative: $(2X_2)^2 - 4X_1^2 \geq 0$, i.e. $X_2^2 \geq X_1^2$. This gives $\mathbb{P}(\text{real roots}) = \mathbb{P}(X_2^2 \geq X_1^2) = \mathbb{P}((X_1+X_2)(X_1-X_2) \leq 0)$.
From the hint, I have also calculated the covariance of $X_1+X_2$ and $X_1-X_2$: $\operatorname{Cov}(X_1+X_2,\, X_1-X_2) = \operatorname{Var}(X_1) - \operatorname{Var}(X_2) = 0$. Since these two are jointly (bivariate) normal, zero covariance implies they are independent.
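As a quick sanity check on the zero-covariance claim (the sampling setup with `numpy` is my own addition, not part of the problem), the sample covariance of $X_1+X_2$ and $X_1-X_2$ should come out near zero:

```python
import numpy as np

# Sample (X1, X2) from the bivariate normal with means 0, variances 1, rho = 1/2,
# then check that the empirical covariance of X1+X2 and X1-X2 is close to 0.
rng = np.random.default_rng(1)
cov_matrix = [[1.0, 0.5], [0.5, 1.0]]
x1, x2 = rng.multivariate_normal([0.0, 0.0], cov_matrix, size=500_000).T
c = np.cov(x1 + x2, x1 - x2)[0, 1]
print(c)  # should be near 0
```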
And so, $\mathbb{P}((X_1+X_2)(X_1-X_2) \leq 0) = \mathbb{P}(X_1+X_2\leq 0)\,\mathbb{P}(X_1-X_2\geq 0) + \mathbb{P}(X_1-X_2\leq 0)\,\mathbb{P}(X_1+X_2\geq 0)$.

From here, is the only way to proceed to compute these terms using the joint density of $X_1$ and $X_2$, or is there a faster approach?
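For reference, a Monte Carlo sketch estimating the target probability directly (again using `numpy`, my assumption; $X_1$ and $X_2$ are exchangeable here, which suggests the estimate should land near $1/2$):

```python
import numpy as np

# Estimate P(X2^2 >= X1^2), i.e. P(real roots), for (X1, X2) bivariate normal
# with means 0, variances 1, correlation rho = 1/2.
rng = np.random.default_rng(0)
cov_matrix = [[1.0, 0.5], [0.5, 1.0]]
x = rng.multivariate_normal([0.0, 0.0], cov_matrix, size=1_000_000)
estimate = np.mean(x[:, 1] ** 2 >= x[:, 0] ** 2)
print(estimate)  # expect a value close to 0.5
```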