I'm trying to solve the equation
$$2^x + 2^{-x} = 10$$
for $x$. The mistake I made so far is:
$$\log(2^x) + \log(2^{-x}) = \log(10)$$
$$x \cdot \log(2) - x\cdot \log(2) = 1$$
$$0 = 1$$
To make this clearer, make the substitution $u = 2^x$. Then your equation becomes
$$u + \frac 1 u = 10$$
Multiply through by $u$:
$$u^2 + 1 = 10u$$
Bring everything to one side:
$$u^2 - 10u + 1 = 0$$
You can use the quadratic formula to find the roots. Keep in mind that those roots are values of $u$, not of $x$. Since $u = 2^x$, i.e. $x = \log_2(u)$, you then take $\log_2$ of each root of the quadratic to find $x$.
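Carrying this through explicitly (worth verifying yourself), the quadratic formula gives
$$u = \frac{10 \pm \sqrt{100 - 4}}{2} = \frac{10 \pm \sqrt{96}}{2} = 5 \pm 2\sqrt{6}$$
Both roots are positive (note $2\sqrt 6 \approx 4.9$), so both are legitimate values of $2^x$, and
$$x = \log_2\!\left(5 \pm 2\sqrt{6}\right)$$
Since $(5 + 2\sqrt 6)(5 - 2\sqrt 6) = 25 - 24 = 1$, the two roots are reciprocals of each other, so the two values of $x$ are negatives of each other, exactly as you'd expect from the symmetry of $2^x + 2^{-x}$ under $x \mapsto -x$.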
Also, to address your mistake:
$$\log_b(x+y) \neq \log_b(x) + \log_b(y)$$
When you take the log of both sides, you cannot apply it term by term; you have to apply it to each entire side of the equation. The property you may be thinking of is
$$\log_b(xy) = \log_b(x) + \log_b(y)$$
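In fact, applying that correct product rule to the left-hand side of your attempt shows exactly where the $0 = 1$ came from:
$$\log(2^x) + \log(2^{-x}) = \log(2^x \cdot 2^{-x}) = \log(1) = 0$$
That sum is identically $0$ for every $x$, so it can never equal $\log(10) = 1$. What you actually needed was $\log(2^x + 2^{-x})$, which does not split into $\log(2^x) + \log(2^{-x})$; that is why the substitution $u = 2^x$ is the cleaner route.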