Comparing the time complexity of two algorithms (inequality)


The question says that two algorithms take $T_A(n)=0.0001n^2$ and $T_B(n)=50 \sqrt n$ microseconds, respectively, on a problem of size $n$, and asks at what input size Algorithm $A$ becomes better than $B$. How do I find this?

I found a similar question with a solution that I don't fully understand. There, the running times are $T_A = 0.1n^2\log_2 n$ and $T_B = 2.5n^2$.

It's solved as follows:

$2.5n^2 < 0.1n^2 \log_2 n$
$2.5 < 0.1 \log_2 n$ (divide both sides by $n^2 > 0$)
$25 < \log_2 n$ (divide both sides by $0.1$)
$2^{25} < n$ (apply the increasing function $2^x$ to both sides)

Therefore, Algorithm $B$ is better when $n$ is greater than $2^{25}$
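A quick numeric sanity check of that worked example (my own addition, using the example's two running-time formulas):

```python
import math

def t_a(n):
    # T_A(n) = 0.1 n^2 log2(n), from the similar question
    return 0.1 * n ** 2 * math.log2(n)

def t_b(n):
    # T_B(n) = 2.5 n^2
    return 2.5 * n ** 2

n = 2 ** 25
# At the threshold the two running times coincide (up to float rounding)...
print(math.isclose(t_a(n), t_b(n)))  # True
# ...and just past it, B is already the faster algorithm.
print(t_b(2 * n) < t_a(2 * n))       # True
```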

I understand the idea of using the inequality $0.0001n^2 < 50 \sqrt n$, but I'm having a hard time following what's being done in the example's solution and how to apply it to my question...

BEST ANSWER

Suppose that the two algorithms solve the same problem. In that case you can say that Algorithm $A$ is better than $B$ when $T_A < T_B$, because the former runs in less time.

For which (positive) values of $n$ does the inequality $T_A < T_B$ hold? Substituting the expressions for $T_A$ and $T_B$, you get

$0.0001 n^2 < 50 \sqrt n$.

You can learn how to solve an inequality of this kind with pen and paper e.g. here, or solve it graphically and algebraically with a symbolic-math engine such as Wolfram Alpha. If you do, you will see that $T_A < T_B$ only when $n$ is below some threshold value. Beyond that threshold, Algorithm $B$ "gets better than $A$", because a square-root function (like $T_B(n)$) grows more slowly than a quadratic one (like $T_A(n)$) as $n$ increases.
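As a concrete illustration, here is a short numeric sketch (my own addition, not part of the original answer) that carries out the same divide-and-simplify steps on the asker's inequality and checks the resulting threshold:

```python
def t_a(n):
    # T_A(n) = 0.0001 n^2 microseconds
    return 0.0001 * n ** 2

def t_b(n):
    # T_B(n) = 50 sqrt(n) microseconds
    return 50 * n ** 0.5

# Pen-and-paper steps, mirroring the worked example:
#   0.0001 n^2 < 50 n^(1/2)    divide both sides by 0.0001 > 0
#   n^2 < 500000 n^(1/2)       divide both sides by n^(1/2) > 0
#   n^(3/2) < 500000           raise both sides to the power 2/3
#   n < 500000^(2/3)           i.e. roughly n < 6300
threshold = 500000 ** (2 / 3)

print(threshold)                 # roughly 6300
print(t_a(1000) < t_b(1000))     # True: A is faster below the threshold
print(t_a(10000) > t_b(10000))   # True: B is faster above it
```

So $A$ is the better algorithm only for inputs smaller than about $6300$; for every larger $n$, $B$ wins.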