The question is:

Find the set of all values of $a$ such that the equation $f(x) = x^2 + (a-3)x + a = 0$ has exactly one root $\alpha$ in the interval $(1,2)$, and $f(x+\alpha) = 0$ has exactly one root in the interval $(0,1)$.
The second condition implies that there is exactly one root in $(\alpha,\alpha+1)$. Since $\alpha$ is the only root in $(1,2)$, the second root must in fact lie in $[2,\alpha+1)$. Because the parabola opens upward, we must have $f(1)>0$, $f(2)\le 0$, and the difference between the roots must be between $0$ and $1$. Do you know how the discriminant relates to that difference?
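For that last step, here is a sketch of the standard relation between the root difference and the discriminant of a monic quadratic (writing $\beta$ for the second root, and $D$ for the discriminant):

```latex
For a monic quadratic $x^2 + bx + c$ with real roots $\alpha \le \beta$,
\[
\beta - \alpha
  = \frac{-b+\sqrt{b^2-4c}}{2} - \frac{-b-\sqrt{b^2-4c}}{2}
  = \sqrt{b^2-4c} = \sqrt{D}.
\]
Here $b = a-3$ and $c = a$, so
\[
\beta - \alpha = \sqrt{(a-3)^2 - 4a} = \sqrt{a^2 - 10a + 9},
\]
and the condition $0 < \beta - \alpha < 1$ becomes
\[
0 < a^2 - 10a + 9 < 1.
\]
```

Combining this inequality with the sign conditions $f(1)>0$ and $f(2)\le 0$ pins down the admissible set of $a$.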