I have a small logarithm-related question that I can't figure out how to solve.
If it takes $\log n$ microseconds to run a program of size $n$, what is the maximum size of a program that can run in $1$ second? Here, the base of the logarithm is $2$.
I end up with the equation $(n \cdot 10^6)/\log n = {}$ maximum size of the program that can run in $1$ second. However, I'm unsure how to solve this further.
The answer I found for this question is $2^{10^6}$. However, the steps to obtain this answer were not shown. May I please know how to solve this question?
I believe the problem is in your equation. You should have
$$\log n = 10^6,$$
that is, the time it takes to run a program of size $n$ should equal one million microseconds (which is $1$ second). Now just solve for $n$.
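Concretely, since the logarithm is base $2$, exponentiating both sides gives
$$\log_2 n = 10^6 \implies n = 2^{10^6},$$
which matches the answer you found.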