In an article I'm reading, $m$ is a natural number, $$ H_m = 1 + 1/2 + 1/3 + \cdots + 1/m $$ is the $m$-th harmonic number, $$ p_i=\frac{1}{i H_m} \quad \text{for} \quad i \in \{1, \ldots, m\} $$ (so that the $p_i$ are the values of a probability distribution), $$ \Theta_m(t)=\prod_{i=1}^m (1-e^{-p_i t}) $$ for $t>0$, and $$ F_m(x) = -\log \Theta_m(xm \log m H_m), $$ which I take to mean $-\log \Theta_m(xm (\log m) H_m)$. It is claimed that for fixed $x$, as $m \to \infty$, we have $$ F_m(x) \to \infty \quad \text{if} \quad x<1 $$ and $$ F_m(x) \to 0 \quad \text{if} \quad x \geq 1. $$

I have tried to prove this using the bounds $$ -m \log(1-m^{-xm}) \leq -\log \Theta_m(xm (\log m) H_m) \leq -m \log(1-m^{-x}), \tag{1} $$ which I derive from $$ \frac{1}{mH_m} \leq p_i \leq \frac{1}{H_m}, \quad i \in \{1, \ldots, m\}. $$

Fixing $x > 1$ (my argument doesn't work for $x=1$, which may or may not matter later) and applying L'Hôpital's rule to the upper bound in (1), I find that it tends to $0$ as $m \to \infty$, which suffices to prove that $F_m(x) \to 0$, as required. However, fixing $0<x<1$ and applying L'Hôpital's rule several times to the lower bound in (1), I find that it also tends to $0$ as $m \to \infty$, which is not sufficient to prove that $F_m(x) \to \infty$. I suspect that my bound $p_i \leq \frac{1}{H_m}$ for all $i$ is simply not sharp enough.
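For what it's worth, a quick numerical look at the two bounds in (1) (a Python sketch; the helper names are my own) illustrates the failure mode: for $x>1$ the upper bound does shrink to $0$, but for $x<1$ the lower bound shrinks to $0$ as well, so it can never force $F_m(x) \to \infty$.

```python
import math

# The two bounds in (1), evaluated numerically:
#   lower_bound(m, x) = -m log(1 - m^{-x m})
#   upper_bound(m, x) = -m log(1 - m^{-x})
# (Function names are mine, not from the article.)

def lower_bound(m, x):
    # m^{-x m} = exp(-x m log m); this underflows to 0.0 for large m,
    # which only reinforces the point that the bound collapses to 0.
    return -m * math.log1p(-math.exp(-x * m * math.log(m)))

def upper_bound(m, x):
    return -m * math.log1p(-math.exp(-x * math.log(m)))

for m in (100, 1000, 10000):
    # x = 1.5: the upper bound shrinks, consistent with F_m(x) -> 0.
    # x = 0.5: the lower bound also shrinks, so it cannot show F_m(x) -> infinity.
    print(m, upper_bound(m, 1.5), lower_bound(m, 0.5))
```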
My next attempt was to use the concavity in $p$ of the function $p \mapsto \log(1-e^{-pt})$, which can be verified by differentiating twice with respect to $p$. Since $\sum_{i=1}^m p_i = 1$, Jensen's inequality yields $$ -\log \Theta_m(t) \geq -m \log(1-e^{-\frac{t}{m}}), $$ and substituting $t = xm(\log m)H_m$ gives $$ F_m(x) \geq -m \log(1-m^{-x H_m}). $$ Certainly, $m^{-xH_m}$ tends to $0$ more slowly than $m^{-xm}$ does, which suggests a better chance of success than with the previous bound. In order to apply L'Hôpital's rule to compute $-\lim_{m \to \infty} m \log(1-m^{-x H_m})$ for $0<x<1$, I think I need to extend $H_m$ to the real numbers using $$ H_m = \psi(m+1)+\gamma, $$ where $\psi$ is the digamma function and $\gamma$ is the Euler-Mascheroni constant. However, according to Wolfram Alpha, the limit when $x=0.9$ is $0$, not $\infty$, as hoped.
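Incidentally, unwinding the definitions gives $F_m(x)=-\sum_{i=1}^m\log\bigl(1-e^{-x(m/i)\log m}\bigr)$ (since $p_i \cdot xm(\log m)H_m = x(m/i)\log m$), and evaluating this directly does suggest the claimed dichotomy is true. A small Python check (I use $x=0.5$ and $x=1.5$; near $x=1$ the growth appears far too slow to be visible at computable $m$):

```python
import math

def F(m, x):
    # F_m(x) = -sum_i log(1 - exp(-x (m/i) log m)), straight from the definitions.
    # log1p keeps the small terms accurate; underflowed terms contribute 0.
    return -sum(math.log1p(-math.exp(-x * (m / i) * math.log(m)))
                for i in range(1, m + 1))

for m in (100, 1000, 10000, 100000):
    print(m, F(m, 0.5), F(m, 1.5))  # x = 0.5 grows, x = 1.5 shrinks
```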
How can I prove that $F_m(x) \to \infty$ when $x<1$?
Concavity of the logarithm tells us $\log(1-a)\le -a$ for all $a<1$. So $$F_m(x)= -\sum_{i=1}^m\log\left(1-e^{-x(m/i)\log m}\right)\ge \sum_{i=1}^m e^{-x(m/i)\log m}.$$ If $0<x<1$, then for values of $i$ in the range $\sqrt x\, m \le i \le m$ we have $$\sqrt x \le \frac i m \le 1, \quad \text{hence} \quad 1 \le \frac m i \le \frac 1{\sqrt x},$$ and therefore $$ m^{-\sqrt x} = e^{-\sqrt x\log m }\le e^{-x\frac m i\log m }\le e^{-x\log m} = m^{-x}. $$ Summing the lower bound $m^{-\sqrt x}$ over the roughly $(1-\sqrt x)m$ values of $i$ in that range gives $$\sum_{i=\lceil\sqrt x\, m\rceil}^m e^{-\sqrt x\log m }\approx (1-\sqrt x)\,m^{1-\sqrt x}=(1-\sqrt x)e^{(1-\sqrt x)\log m},$$ which grows without bound as $m\to\infty$.
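If it helps to see the estimate in action, here is a small numerical check (Python; the value $x=0.5$ is an arbitrary choice) of the chain of inequalities above: $F_m(x)$ dominates the partial sum over $\lceil\sqrt x\, m\rceil\le i\le m$, which in turn dominates that many copies of $m^{-\sqrt x}$, and the crude bound itself grows with $m$.

```python
import math

def F(m, x):
    # F_m(x) = -sum_i log(1 - exp(-x (m/i) log m))
    return -sum(math.log1p(-math.exp(-x * (m / i) * math.log(m)))
                for i in range(1, m + 1))

x = 0.5
for m in (100, 1000, 10000):
    i0 = math.ceil(math.sqrt(x) * m)  # start of the range sqrt(x) m <= i <= m
    partial = sum(math.exp(-x * (m / i) * math.log(m)) for i in range(i0, m + 1))
    crude = (m - i0 + 1) * m ** (-math.sqrt(x))  # each term is at least m^{-sqrt x}
    print(m, F(m, x), partial, crude)
    assert F(m, x) >= partial >= crude  # the chain of inequalities, verified
```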