The equation $100\log(5x)\log(2x)+1 = 0$ has two distinct real roots $\alpha$ and $\beta$. Find the value of $\alpha\beta$.
I'm having trouble with this because the answer key gives $1/10$ as the answer.
What I did was expand the whole equation to turn it into a quadratic.
Let $a = \log x$, so that $\log(5x) = \log 5 + a$ and $\log(2x) = \log 2 + a$. Then
$100(a^2 + (\log5+\log2)a + \log5\log2) +1 = 0$
so, reading off the constant term, my $\alpha\beta = 100\log5\log2 + 1$.
But that does not match the answer key at all.
You have made good progress.
But be careful: the roots of your quadratic in $a$ are $\log\alpha$ and $\log\beta$, not $\alpha$ and $\beta$ themselves. For $$ Aa^2 + Ba + C = 0 $$ Vieta's formulas give the product of the roots as $C/A$ and the sum as $-B/A$. The product $C/A$ would only tell you $\log\alpha\cdot\log\beta$, which is not what you want. Instead, use the sum: $$ \log\alpha + \log\beta = \log(\alpha\beta) = -\frac{B}{A}. $$ Here $A = 100$ and $B = 100(\log5 + \log2) = 100\log10 = 100$, so $\log(\alpha\beta) = -1$ and $\alpha\beta = 10^{-1} = 1/10$, matching the answer key.
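As a sanity check, here is a short numeric sketch (assuming base-10 logarithms throughout, which is what makes the answer key's $1/10$ come out):

```python
import math

# Quick numeric check of the substitution a = log10(x):
# 100*log10(5x)*log10(2x) + 1 = 0  becomes
# 100*a^2 + 100*(log10 5 + log10 2)*a + (100*log10 5*log10 2 + 1) = 0
A = 100.0
B = 100.0 * (math.log10(5) + math.log10(2))  # = 100, since log10(10) = 1
C = 100.0 * math.log10(5) * math.log10(2) + 1.0

disc = B * B - 4 * A * C                     # positive, so two distinct real roots
a1 = (-B + math.sqrt(disc)) / (2 * A)
a2 = (-B - math.sqrt(disc)) / (2 * A)

alpha, beta = 10 ** a1, 10 ** a2             # back-substitute x = 10**a
print(alpha * beta)                          # ≈ 0.1, since a1 + a2 = -B/A = -1
```

Both $10^{a_1}$ and $10^{a_2}$ satisfy the original equation, and their product comes out to $0.1$ up to floating-point error.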