Information inequality. If $\theta_0$ is identified [$\theta \neq \theta_0 \implies f(z, \theta) \neq f(z, \theta_0)$] and $E[\ln f(z, \theta)] < \infty$ for all $\theta$, then $L(\theta) = E[\ln f(z,\theta)]$ has a unique maximum at $\theta_0$.
Proof. By the strict version of Jensen's inequality, $E[-\ln Y] > -\ln E[Y]$ for any nonconstant, positive random variable $Y$. Taking $Y = f(z,\theta)/f(z,\theta_0)$, which is nonconstant when $\theta \neq \theta_0$ by identification,
$$ L(\theta_0) - L(\theta) = E\left[-\ln \frac{f(z,\theta)}{f(z,\theta_0)}\right] > -\ln E\left[\frac{f(z,\theta)}{f(z,\theta_0)}\right] = 0, $$
where the last equality holds because the expectation is taken under $f(\cdot,\theta_0)$, so $E[f(z,\theta)/f(z,\theta_0)] = \int f(z,\theta)\,dz = 1$.
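The two facts used in this display can be checked numerically. A minimal Monte Carlo sketch, assuming (purely for illustration — this family is not from the text) $f(z,\theta)$ is the $N(\theta, 1)$ density with $\theta_0 = 0$ and $\theta = 1$:

```python
# Monte Carlo check of the two facts in the display above, for the assumed
# illustrative family f(z, theta) = N(theta, 1) density, theta_0 = 0:
#   (i)  E[f(z, theta)/f(z, theta_0)] = 1   (densities integrate to 1), and
#   (ii) L(theta_0) - L(theta) > 0          (the information inequality).
import math
import random

random.seed(0)
theta0, theta = 0.0, 1.0
n = 100_000
zs = [random.gauss(theta0, 1.0) for _ in range(n)]  # draws from f(., theta_0)

def log_f(z, t):
    """Log density of N(t, 1)."""
    return -0.5 * math.log(2 * math.pi) - 0.5 * (z - t) ** 2

# (i) sample mean of the likelihood ratio f(z, theta)/f(z, theta_0)
ratio_mean = sum(math.exp(log_f(z, theta) - log_f(z, theta0)) for z in zs) / n

# (ii) sample mean of -ln[f(z, theta)/f(z, theta_0)]; for unit-variance
# Gaussians this is the KL divergence (theta - theta0)^2 / 2 = 0.5
gap = sum(log_f(z, theta0) - log_f(z, theta) for z in zs) / n

print(f"E[ratio] ~ {ratio_mean:.3f}, L(theta0) - L(theta) ~ {gap:.3f}")
```

With $10^5$ draws, `ratio_mean` should land close to $1$ and `gap` close to $0.5$, consistent with the strict inequality.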
Why does this imply that the maximum at $\theta_0$ is unique?
Consider the non-strict version of Jensen's inequality (i.e., drop the nonconstancy condition) and examine when equality can occur:
If there existed some $\theta_1$ that were also a maximum, then $L(\theta_1) = L(\theta_0)$, so
$$ 0 = L(\theta_0) - L(\theta_1) = E\left[-\ln \frac{f(z,\theta_1)}{f(z,\theta_0)}\right] \geq -\ln E\left[\frac{f(z,\theta_1)}{f(z,\theta_0)}\right] = 0. $$
Because $-\ln$ is strictly convex, equality in Jensen's inequality holds if and only if $f(z,\theta_1)/f(z,\theta_0)$ is almost surely constant; since its expectation equals $1$, that constant must be $1$. Hence $f(z,\theta_1) = f(z,\theta_0)$ almost everywhere. But identification says $\theta \neq \theta_0 \implies f(z,\theta) \neq f(z,\theta_0)$, so it follows that $\theta_1 = \theta_0$, and the maximum is unique.
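Alternatively, the strict version already delivers uniqueness directly: identification makes the ratio $f(z,\theta)/f(z,\theta_0)$ nonconstant for every $\theta \neq \theta_0$, and assuming (as in the proof above) that its expectation under $f(\cdot,\theta_0)$ equals $1$,
$$ \theta \neq \theta_0 \;\Longrightarrow\; L(\theta_0) - L(\theta) = E\left[-\ln \frac{f(z,\theta)}{f(z,\theta_0)}\right] > -\ln E\left[\frac{f(z,\theta)}{f(z,\theta_0)}\right] = 0, $$
i.e. $L(\theta) < L(\theta_0)$ for every $\theta \neq \theta_0$, which is exactly the statement that $\theta_0$ is the unique maximizer.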