The theorem is that under certain assumptions (regularity conditions):
$$ \lim_{n\to \infty} P_{\theta_0} [L(\theta _0, X) > L(\theta, X)]=1, \,\,\forall \theta \neq\theta _0$$
I'm having difficulties understanding the proof. The setup is $L(\theta; x) = \prod_{i=1}^n f(x_i; \theta), \,\,\,\theta \in \Omega$.
Then you take the log of the inequality $L(\theta _0, X) > L(\theta, X)$.
I've been able to get this to $\sum_{i=1}^n [f(x_i,\theta) - f(x_i,\theta_0)] < 0$.
According to the book, I should be able to get to $ \frac1n \sum_{i=1}^n \log \Big[\frac{f(X_i;\theta)}{f(X_i;\theta_0)}\Big] < 0$ but I'm not sure how.
Any help would be appreciated. And any corrections to the mathjax would also be appreciated.
References
The theorem is number 6.1.1 in:
Hogg, McKean & Craig, Introduction to Mathematical Statistics, Seventh Edition (Pearson, 2013)
Alternatively, it can be found online in these notes.
I think you made a mistake: you didn't take the logarithm of the individual densities. Once you do, the step becomes straightforward:
$$
\begin{aligned}
\log L(\theta_0; X) > \log L(\theta; X)
&\;\Rightarrow\; \sum_{i=1}^n \log f(x_i; \theta_0) > \sum_{i=1}^n \log f(x_i; \theta) \\
&\;\Rightarrow\; \sum_{i=1}^n \big[\log f(x_i; \theta) - \log f(x_i; \theta_0)\big] < 0 \\
&\;\Rightarrow\; \sum_{i=1}^n \log \left(\frac{f(x_i; \theta)}{f(x_i; \theta_0)}\right) < 0.
\end{aligned}
$$
Multiplying by $\frac1n$ (which doesn't change the sign of the inequality) gives exactly the book's form;)
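To see the theorem in action, here is a minimal simulation sketch (my own illustration, not from the book): it assumes a $N(\theta, 1)$ model with true mean $\theta_0 = 0$ and alternative $\theta = 1$, and estimates $P_{\theta_0}\big[\frac1n \sum_i \log \frac{f(X_i;\theta)}{f(X_i;\theta_0)} < 0\big]$ by Monte Carlo for increasing $n$. By the law of large numbers the average converges to a negative constant, so the probability should approach 1.

```python
import numpy as np

rng = np.random.default_rng(0)
theta0, theta = 0.0, 1.0  # true and alternative means (assumed N(theta, 1) model)

def avg_log_ratio(x, theta, theta0):
    # (1/n) * sum_i log[ f(x_i; theta) / f(x_i; theta0) ] for the N(mean, 1) density;
    # the common normalizing constants cancel in the ratio.
    return np.mean(-0.5 * (x - theta) ** 2 + 0.5 * (x - theta0) ** 2)

probs = {}
for n in [5, 50, 500]:
    reps = 2000  # number of Monte Carlo replications
    samples = rng.normal(theta0, 1.0, size=(reps, n))
    ratios = np.array([avg_log_ratio(x, theta, theta0) for x in samples])
    # Fraction of replications where L(theta0; X) > L(theta; X)
    probs[n] = np.mean(ratios < 0)
    print(n, probs[n])
```

For this model the average log-ratio has mean $-\frac{(\theta-\theta_0)^2}{2} = -0.5$ under $\theta_0$, so the estimated probability climbs from roughly $0.87$ at $n=5$ to essentially $1$ at $n=500$.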