The equation $100\log(5x)\log(2x)+1 = 0$ has two distinct real roots $\alpha$ and $\beta$. Find the value of $\alpha\beta$.



I'm having trouble with this because the answer key says $1/10$ as the answer.

However, what I did was to expand the whole equation to make it a quadratic.

Let $a = \log x$

$100(a^2 + (\log5+\log2)a + \log5\log2) +1 = 0$

so I got $\alpha\beta = 100\log5\log2 + 1$.

But that does not match the answer key at all.




You have made good progress.

The product of the roots of $$ Ax^2+Bx +C=0 $$ is $C/A$

Please make sure you first find your $A$ and $C$ correctly, then divide to get the correct result.
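One subtlety worth flagging (my reading, not spelled out above): the quadratic is in $a=\log x$, so Vieta's formulas relate $\log\alpha$ and $\log\beta$ rather than $\alpha$ and $\beta$ directly. It is the *sum* of the roots that recovers the product $\alpha\beta$, since $\log\alpha+\log\beta=\log(\alpha\beta)$. For $100a^2+100(\log5+\log2)a+(100\log5\log2+1)=0$, assuming base $10$:

$$\log\alpha+\log\beta=-\frac BA=-\frac{100(\log5+\log2)}{100}=-\log10=-1\implies\alpha\beta=10^{-1}=\frac1{10}.$$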


Your approach is correct. Note that from $$100(\log^2x+(\log5+\log2)\log x+\log5\log2)+1=0,$$ and assuming base $10$ (so $\log5+\log2=\log10=1$), dividing through by $100$ gives $$\log^2x+\log x+C=0\implies\log x=\frac{-1\pm K}2,$$ where $C=\log5\log2+\frac1{100}$ and $K=\sqrt{1-4C}$. Thus $$x=10^{(-1\pm K)/2}\implies\alpha\beta=10^{(-1+K)/2}\cdot10^{(-1-K)/2}=10^{-1}=\frac1{10},$$ as desired.
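As a quick numerical sanity check (a sketch of my own, not part of the answer above; it assumes base-10 logarithms throughout):

```python
import math

# Substitute a = log10(x) into 100*log10(5x)*log10(2x) + 1 = 0:
#   100*(a + log10(5))*(a + log10(2)) + 1 = 0
# Dividing by 100 and using log10(5) + log10(2) = 1:
#   a^2 + a + C = 0,  with  C = log10(5)*log10(2) + 1/100
C = math.log10(5) * math.log10(2) + 1 / 100
disc = 1 - 4 * C                      # discriminant of a^2 + a + C
a1 = (-1 + math.sqrt(disc)) / 2
a2 = (-1 - math.sqrt(disc)) / 2
alpha, beta = 10 ** a1, 10 ** a2      # back-substitute x = 10^a

print(alpha * beta)                   # ~0.1, since a1 + a2 = -1
```

Both roots satisfy the original equation to floating-point precision, and their product comes out as $10^{-1}$, matching the answer key.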