I thought of proceeding by contrapositive. Assume $f(x)\ge1$ and $f'(x)\le K$ for some $K\in\Bbb R$ and all $x\ge0$. Then for any $t>0$ it follows that \begin{align} \int_0^t f'(x)\,dx=f(t)-f(0)\le Kt,\end{align}i.e. $f(t)\le Kt+f(0)$. Both sides must be $\ge1$, hence positive, so $$\frac1{f(t)}\ge\frac{1}{Kt+f(0)},$$which means $\int_0^\infty\frac1f$ diverges by comparison.
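To make the last step explicit (assuming $K>0$; if $K\le0$, then $f(t)\le f(0)$, so $\frac1{f(t)}\ge\frac1{f(0)}>0$ and divergence is immediate):
$$\int_0^T\frac{dt}{f(t)}\ \ge\ \int_0^T\frac{dt}{Kt+f(0)}\ =\ \frac1K\ln\frac{KT+f(0)}{f(0)}\ \xrightarrow{\ T\to\infty\ }\ \infty.$$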
However, I'm assuming $f'$ has finitely many discontinuities, which is not guaranteed: derivatives can be discontinuous almost everywhere, although a Riemann-integrable one must be a.e. continuous, right? More precisely, I'm using $f'\in\mathcal{R}[0,t]$ for every $t>0$. Does the result still hold if this fails? How would one prove it, or what is a counterexample?
You don't need to use derivatives.
By the assumption of Lipschitz continuity, there exists a constant $K>0$ such that $|f(x) - f(y)| \leq K |x-y|$ for every $x,y$. In particular, $$ f(t) - f(0) \leq K t \qquad \forall t\geq 0, $$ and then you can proceed as in your proof.