I am trying to understand an implication I came across in a research article, and I was wondering if someone could provide more details or clarifications on the reasoning behind it.
The inequality in question is as follows:
$\frac{1}{p} > \frac{\log(p)}{\log(n) \cdot \log(\log(n))}$
where $p$ is the greatest prime in the factorization of $n$, and the claim is that this implies $p < \log(n)$.
I was wondering if someone could explain the steps that lead to this conclusion. Is there a specific logarithmic property or manipulation that directly justifies $p < \log(n)$? Or is some additional context needed that hasn't been shared?
Thank you in advance.
Assuming that $n > e$ (so that $\log(\log(n)) > 0$ and the denominator on the right is positive), multiplying both sides by $p \cdot \log(n) \cdot \log(\log(n)) > 0$ shows that the inequality is equivalent to $$ p \cdot \log(p) < q \cdot \log(q) $$ with $q := \log(n) > 1$. Since the function $x \mapsto x \cdot \log(x)$ is strictly increasing on $[1, \infty)$, it follows that $p < q = \log(n)$.
(The fact that $p$ is a prime factor of $n$ is not needed for this conclusion, only that $p \ge 1$ and $n > e$.)
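As a quick sanity check (the numbers are chosen purely for illustration and are not from the article), take $n = 2^{20}$, so $p = 2$, $\log(n) \approx 13.86$ and $\log(\log(n)) \approx 2.63$. Then
$$ \frac{1}{p} = 0.5 \;>\; \frac{\log(2)}{13.86 \cdot 2.63} \approx 0.019, $$
so the hypothesis holds, and indeed $p = 2 < 13.86 \approx \log(n)$, consistent with the conclusion above.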