Determining minimum problem size "n"


My problem is:

The execution time of the first algorithm can be given as a function of the input size $n$ as $f(n) = n^{1.5} \log^2 n$. The execution time of the second algorithm is similarly $g(n) = n^2$.

Find the minimum problem size $n$ at which the asymptotically faster algorithm becomes faster than the other one.

I know that $f(n)$ is asymptotically faster than $g(n)$, since $n^{1.5} \log^2 n \in o(n^2)$. However, on the graphs I have plotted, $f(n)$ never appears to drop below $g(n)$, so I do not understand at what point the asymptotically faster algorithm actually becomes faster than the other one.


Best answer:

$n^2$ eventually dominates $n^{1.5} \log^2 n$.

You want to find where the execution time functions intersect. Setting $f(n) = g(n)$ gives $n^{1.5} \log^2 n = n^2$, so $\log^2 n = n^{1/2}$, i.e. $\log n = n^{1/4}$ (taking the positive root). Substituting $n = e^x$ turns this into $x = e^{x/4}$; equations mixing $x$ and $e^x$ like this are solved with the Lambert W function, yielding $n = e^{-4 W(-1/4)}$ for the two real branches $W_0$ and $W_{-1}$. The curves cross twice; take the ceiling of the larger intersection point. See WolframAlpha for a visualization.
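If you prefer a numeric check to the Lambert W closed form, the larger crossing can be bracketed and found by bisection. A minimal sketch, assuming the natural logarithm (the problem statement does not fix the base; base 2 would shift the crossover):

```python
import math

def f(n):
    # First algorithm: n^1.5 * (ln n)^2, assuming natural log.
    return n ** 1.5 * math.log(n) ** 2

def g(n):
    # Second algorithm: n^2.
    return n ** 2

def h(n):
    # f(n) = g(n) reduces to log n = n^(1/4); h > 0 means f > g.
    return math.log(n) - n ** 0.25

# h(10) > 0 and h(1e6) < 0, so [10, 1e6] brackets the larger root.
lo, hi = 10.0, 1e6
for _ in range(100):
    mid = (lo + hi) / 2
    if h(mid) > 0:
        lo = mid
    else:
        hi = mid

# Smallest integer n at which f(n) < g(n).
n_min = math.ceil(hi)
print(n_min)  # → 5504 with natural log
```

The closed form gives the same value: the larger root is $n = e^{-4 W_{-1}(-1/4)} \approx 5503.66$, so the first algorithm is strictly faster from $n = 5504$ onward.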