I'm working on a problem:
Show using L’Hôpital’s Rule that a running time of $n\log(n)$ is asymptotically faster than (i.e., little-oh of) a running time of $\frac{n^2}{\log(n)}$.
I suppose a reasonable place to start would be:
$$\frac{n\log(n)}{\frac{n^2}{\log(n)}} $$ $$ = \frac{[\log(n)]^2}{n} $$
Then, after repeatedly applying L’Hôpital’s Rule, I end up with the following:
$$ = \frac{0}{\ln(2)} $$ $$ = 0 $$
I'm not really sure how this proves anything. Can someone help explain this to me?
Suppose $f(x),g(x)\to\infty$ as $x\to \infty$. If you want to determine which grows faster, consider $$\lim_{x\to\infty}{f(x)\over g(x)},$$ which is an indeterminate form that L'Hopital's Rule can resolve.
The limit will be one of three things: $0$, meaning $g$ grows faster than $f$ (i.e., $f$ is little-oh of $g$); $\infty$, meaning $f$ grows faster than $g$; or a positive finite constant, meaning $f$ and $g$ grow at the same rate.
In your case, \begin{align*} \lim_{n\to\infty}{n \log n\over {n^2\over \log n}}=\lim_{n\to\infty}{(\log n)^2\over n}=\lim_{n\to\infty}{2\log n\cdot {1\over n\ln 10}\over 1}&={2\over \ln 10}\lim_{n\to\infty}{\log n\over n}\\ &={2\over \ln 10}\lim_{n\to\infty}{{1\over n\ln 10}\over 1}\\ &={2\over (\ln 10)^2}\lim_{n\to\infty}{1\over n}\\ &=0, \end{align*} where L'Hopital's Rule was used for the second and fourth equalities.
Thus, by the discussion above, ${n^2\over \log n}\to\infty$ "faster" than $n\log n$ does as $n\to\infty$.
Hence, a running time of $n\log n$ is "faster" (in the sense that this quantity grows more slowly) than ${n^2\over \log n}$.
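As a quick numeric sanity check (an illustration, not part of the proof), we can evaluate the simplified ratio $\frac{(\log n)^2}{n}$ for increasing $n$ and watch it shrink toward $0$. The base of the logarithm only changes the constant factor, not the limit, so natural log is used here for convenience:

```python
import math

def ratio(n):
    """Compute (n log n) / (n^2 / log n), which simplifies to (log n)^2 / n.

    The base of the logarithm affects only a constant factor,
    not whether the limit is 0, so natural log is used here.
    """
    return (n * math.log(n)) / (n ** 2 / math.log(n))

# The ratio should decrease toward 0 as n grows.
for n in [10**2, 10**4, 10**6, 10**8]:
    print(f"n = {n:>10}:  ratio = {ratio(n):.6g}")
```

Each successive value is smaller than the last, consistent with the limit of $0$ computed above and with the conclusion that $n\log n$ grows strictly more slowly than $\frac{n^2}{\log n}$.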