Probability problem with a function $f(x)$ and exponential distribution


I have the following problem, which I can't figure out how to solve.

Let $a \in \mathbb{R}$ and let $Y$ be an exponentially distributed random variable with parameter $1$.

Furthermore, let $f(x) := x^2 - ax + Y$ for $x \in \mathbb{R}$. Calculate, as a function of $a$, the probability that $f$ has at least one real root.

For which $a$ is this probability greater than $0.5$?

I don't know how to start. Do I need to substitute $e^{-x}$ for $Y$ in $f(x)$ and then work with $f(x)$ as a density function?


On BEST ANSWER

First consider $f(x) = x^2 - ax + b$ for a fixed $b$; this has a real root if and only if the discriminant is nonnegative:

$$ a^2 - 4b \geq 0 \implies a^2 \geq 4b \implies b \leq a^2/4 .$$

So the probability of having a root is

$$ P(\text{root}) = P(Y \leq a^2 / 4). $$

Integrating the $\mathrm{Exp}(1)$ density from $0$ to $a^2/4$ gives

$$ P(Y \leq a^2/4) = \int_0^{a^2/4} e^{-y}\,dy = 1 - e^{-a^2/4}. $$

Setting this equal to $1/2$ yields $e^{-a^2/4} = 1/2$, i.e. $a^2 = 4\ln 2$, so the probability is greater than $0.5$ exactly when $|a| > 2\sqrt{\ln 2} \approx 1.665$.
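As a quick sanity check, here is a short Python sketch (the function names `p_root` and `p_root_mc` are my own, not from the problem) that evaluates the closed form $1 - e^{-a^2/4}$ and compares it against a Monte Carlo estimate based on the discriminant condition $a^2 - 4Y \geq 0$:

```python
import math
import random

def p_root(a):
    """Closed form: P(f has a real root) = P(Y <= a^2/4) = 1 - exp(-a^2/4), Y ~ Exp(1)."""
    return 1.0 - math.exp(-a * a / 4.0)

# Boundary: 1 - exp(-a^2/4) = 1/2  =>  a^2 = 4 ln 2  =>  |a| = 2*sqrt(ln 2)
a_star = 2.0 * math.sqrt(math.log(2.0))

def p_root_mc(a, n=200_000, seed=0):
    """Monte Carlo: sample Y ~ Exp(1) and test whether a^2 - 4Y >= 0."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if a * a - 4.0 * rng.expovariate(1.0) >= 0.0)
    return hits / n

print(a_star)                 # ~ 1.665
print(p_root(a_star))         # ~ 0.5 at the boundary
print(p_root_mc(2.0), p_root(2.0))
```

The Monte Carlo estimate should agree with the closed form to within sampling error, which makes it easy to convince yourself the discriminant reasoning is right before doing the integral by hand.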