I am currently in Grade 12 and came across the following question in a past paper:
$$g(x) = \frac{2}{x+1}+1$$
The question asks: For which values of k will the equation $g(x) = x + k$ have two real roots that are of opposite signs?
After simplifying the equation I arrive at: $x^2 + kx + (k-3) = 0$.
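For completeness, my simplification steps were (multiplying both sides by $x+1$, so assuming $x \neq -1$):

$$\frac{2}{x+1} + 1 = x + k$$
$$2 + (x+1) = (x+k)(x+1)$$
$$x + 3 = x^2 + (k+1)x + k$$
$$0 = x^2 + kx + (k-3)$$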
From the question I know to use the discriminant ($b^2 - 4ac$), and I know that the discriminant must be greater than zero for two real solutions; however, I am unsure how to require the roots to be of opposite signs.
Nevertheless I continued to simplify the inequality and came to: $k^2 - 4k + 12 > 0$.
From this step I am unable to factorise, and thus I am unable to solve the inequality.

I would appreciate any guidance on how to make the roots have different signs and how to solve the inequality.
The answer according to the memo is that $k<3$.
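As a quick sanity check of the memo's answer (a small script I wrote, not part of the paper), one can solve $x^2 + kx + (k-3) = 0$ numerically for a few values of $k$ and test the signs of the roots:

```python
import math

def roots(k):
    # Solve x^2 + k*x + (k - 3) = 0 with the quadratic formula.
    # Discriminant: k^2 - 4(k - 3) = k^2 - 4k + 12, which is always positive.
    disc = k * k - 4 * (k - 3)
    r1 = (-k + math.sqrt(disc)) / 2
    r2 = (-k - math.sqrt(disc)) / 2
    return r1, r2

def opposite_signs(k):
    # Roots have opposite signs exactly when their product is negative.
    r1, r2 = roots(k)
    return r1 * r2 < 0

print(opposite_signs(2))   # k < 3: True, roots have opposite signs
print(opposite_signs(3))   # k = 3: False, one root is exactly 0
print(opposite_signs(5))   # k > 3: False, both roots are negative
```

This agrees with the memo: opposite-sign roots occur precisely for $k < 3$.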
Note that to get roots of opposite signs would require:
$k^2-4k+12>k^2$
This comes from the quadratic formula: the roots are $\frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$, and for them to have opposite signs, the square root of the discriminant must be greater than $|b|$, which is $|k|$ in this case.
In the above inequality the $k^2$ terms cancel, leaving $-4k + 12 > 0$, which gives $k < 3$ as a result.
Consider two values, $x+y$ and $x-y$. For one to be positive and the other negative, $|y| > |x|$ must be true, or else both will share whatever sign $x$ has. If necessary, plug in numbers to see this point, as it is rather basic algebra to my mind.
The leap from this to the inequality I have above is that $|k|$ and $\sqrt{k^2 - 4k + 12}$ are each squared first (both are non-negative, so squaring preserves the inequality).
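Written out explicitly, with $D = k^2 - 4k + 12$ the roots are

$$x = \frac{-k \pm \sqrt{D}}{2}.$$

The larger root is positive exactly when $\sqrt{D} > k$, and the smaller root is negative exactly when $\sqrt{D} > -k$; both hold simultaneously exactly when $\sqrt{D} > |k|$, i.e. $D > k^2$, which is the inequality above.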