This question has more to do with math than with computer science, but there is a computer science component to it.
If I have an algorithm with time complexity $$O(\log(\log(n)))\;,$$ why would $$\Theta(\log(n))$$ be an upper bound for it? I do not see how the rules of logarithms apply here; what steps were used to arrive at this answer?
Suppose that $f$ is $O(\log(\log n))$, and $g$ is $\Theta(\log n)$. (The $\Theta$ bound on $g$ gives both an upper and a lower bound; here we only need the lower one.) Then there are positive constants $c_0$ and $c_1$ and an $m\in\Bbb Z^+$ such that
$$f(n)\le c_0\log(\log n)$$
whenever $n\ge m$ and
$$g(n)\ge c_1\log n$$
whenever $n\ge m$. Moreover,
$$\lim_{n\to\infty}\frac{\log(\log n)}{\log n}=0\;,$$
so we may further assume that $m$ is large enough that
$$\frac{\log(\log n)}{\log n}\le\frac{c_1}{c_0}$$
whenever $n\ge m$.
But then for $n\ge m$ we have
$$f(n)\le c_0\log(\log n)\le c_1\log n\le g(n)\;,$$
i.e., $g$ is eventually an upper bound for $f$.
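To see the argument numerically, here is a small sanity check. The constants $c_0=c_1=1$ are illustrative assumptions (they are not derived from any particular $f$ or $g$); the point is only that the ratio $\log(\log n)/\log n$ shrinks toward $0$, so the chain of inequalities eventually holds.

```python
import math

# Illustrative constants (assumed for the demo, not from any specific f or g).
c0, c1 = 1.0, 1.0

# The ratio log(log n) / log n tends to 0 as n grows, so past some
# threshold m it falls below c1/c0, which is what the proof needs.
for n in [10, 10**3, 10**6, 10**12]:
    ratio = math.log(math.log(n)) / math.log(n)
    print(n, ratio)

# Once the ratio is below c1/c0, the chain
#   f(n) <= c0 * log(log n) <= c1 * log n <= g(n)
# holds, so g eventually bounds f from above.
```

Running this shows the ratio dropping from about $0.36$ at $n=10$ to about $0.12$ at $n=10^{12}$, matching the limit statement above.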