The idea behind this question is to attach an entropy to a natural number in a "natural" way, such that "multiplication increases entropy". (Of course one can attach very different entropies to natural numbers such that multiplication reduces entropy, but I will try to argue why this choice is natural.) Let $n \ge 2$ be a composite number and $\phi$ the Euler totient function. Suppose a factorization algorithm $A$ outputs a number $X$, $1 \le X \le n-1$, with $1 < \gcd(X,n) < n$, chosen uniformly at random; since there are exactly $n-1-\phi(n)$ such values, each occurs with probability $\frac{1}{n-1-\phi(n)}$. Viewing $X$ as a random variable, we attach to it the entropy $H(X_n):=H(n):= \log_2(n-1-\phi(n))$.
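To make the definition concrete, here is a small Python sketch (the helper names `phi` and `H` are my own choices, not from the question) that computes $\phi(n)$ by brute force and the entropy $H(n)=\log_2(n-1-\phi(n))$:

```python
from math import gcd, log2

def phi(n):
    # Euler's totient: count of 1 <= k <= n with gcd(k, n) == 1
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

def H(n):
    # Entropy of the uniform output X of the factorization algorithm:
    # there are exactly n - 1 - phi(n) values X in [1, n-1]
    # with 1 < gcd(X, n) < n, each with equal probability.
    return log2(n - 1 - phi(n))

# For n = 6 the valid outputs are X in {2, 3, 4}, so H(6) = log2(3).
print(H(6))
```

For $n=4$ the only valid output is $X=2$, so $H(4)=\log_2 1 = 0$.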
The motivation behind this definition comes from an analogy to physics: I think that "one-way functions" correspond to the "arrow of time" in physics. Since the "arrow of time" increases entropy, so should "one-way functions", if they exist. Since integer factorization is a known candidate for a one-way function, my idea was to attach an entropy that increases when multiplying two numbers.
So my question is:
Do we have $H(mn) > H(n) + H(m)$ for all composite numbers $n \ge 2, m \ge 2$?
Yes. It is equivalent to ask whether $$ nm-1-\phi(nm) > (n-1-\phi(n))(m-1-\phi(m)) $$ for all composite numbers $m,n\ge2$. This inequality in turn is equivalent to $$ (m - \phi(m)) \phi(n) + (n \phi(m) - \phi(nm) ) + (m - 1 - \phi(m)) + (n - 1 - \phi(n)) > 0, $$ which is true: the first term in parentheses is positive (since $\phi(m)<m$ and $\phi(n)\ge1$) and all other terms in parentheses are nonnegative. The fact that $n\phi(m) \ge \phi(nm)$ is not obvious, but it is easy to prove: it follows from $$ \frac{\phi(m)}m = \prod_{p\mid m} \bigg( 1-\frac1p \bigg) \ge \prod_{p\mid nm} \bigg( 1-\frac1p \bigg) = \frac{\phi(nm)}{nm}. $$
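As a sanity check (not a proof), the strict superadditivity $H(mn) > H(m) + H(n)$ can be verified numerically for small composites; the helper names `phi` and `H` below are my own:

```python
from math import gcd, log2

def phi(n):
    # Euler's totient by brute force
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

def H(n):
    # Entropy attached to composite n
    return log2(n - 1 - phi(n))

# All composite numbers up to 60
composites = [n for n in range(4, 61) if any(n % d == 0 for d in range(2, n))]

# Check H(mn) > H(m) + H(n) for every pair of small composites
assert all(H(m * n) > H(m) + H(n) for m in composites for n in composites)
print("inequality holds for all composite pairs up to 60")
```

Note the check is vacuously easy when $H(m)=H(n)=0$ (e.g. $m=n=4$), but the proof above covers all cases.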