Assume that the density $f(x;\theta)$ is distinct for different values of the parameter $\theta$ and has common support for all $\theta$. To show that for $\theta \neq \theta_0$ (the true value that generates the samples), $$\lim_{n \to \infty} P\left(L(\theta_0;x_1,\dots,x_n)>L(\theta;x_1,\dots,x_n)\right)=1,$$ where $L$ is the likelihood function of the density $f$, my textbook considers $$\frac{1}{n}\sum_{i=1}^{n} \log\left(\frac{f(x_i;\theta)}{f(x_i;\theta_0)}\right),$$ which by the WLLN converges in probability, as $n \to \infty$, to $$E\left[\log\left(\frac{f(X;\theta)}{f(X;\theta_0)}\right) \right] < \log E\left[\frac{f(X;\theta)}{f(X;\theta_0)}\right]=\log(1)=0$$ by Jensen's inequality (strict because $\log$ is strictly concave and the two densities are distinct). So the average log-likelihood ratio is negative with probability tending to $1$, which gives the claim.
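To see the textbook's argument numerically, here is a small sketch under an assumed model (the post fixes no particular $f$): take $f(x;\theta)$ to be the $N(\theta,1)$ density with true value $\theta_0=0$ and compare against $\theta=1$. The average log-ratio should settle near $-\mathrm{KL}\big(f_{\theta_0}\,\|\,f_\theta\big) = -(\theta-\theta_0)^2/2 = -0.5 < 0$.

```python
import math
import random

# Assumed model for illustration: f(x; theta) = N(theta, 1),
# true parameter theta0 = 0, alternative theta = 1.
random.seed(0)
theta0, theta = 0.0, 1.0
n = 100_000
xs = [random.gauss(theta0, 1.0) for _ in range(n)]

def log_f(x, t):
    # log density of N(t, 1); the additive constant cancels in the ratio
    return -0.5 * (x - t) ** 2

# (1/n) * sum of log(f(x_i; theta) / f(x_i; theta0))
avg = sum(log_f(x, theta) - log_f(x, theta0) for x in xs) / n
print(avg)  # close to -(theta - theta0)**2 / 2 = -0.5
```

With a large $n$ the printed value sits near $-0.5$, matching the Jensen/WLLN limit $E[\log(f(X;\theta)/f(X;\theta_0))] < 0$.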
But I would like to know whether the following solution can also be used.
From the AM–GM inequality, $$L(\theta;x_1,\dots,x_n)^{1/n} = \left(\prod_{i=1}^{n} f(x_i;\theta)\right)^{1/n} \leq \frac{1}{n}\sum_{i=1}^{n} f(x_i;\theta).$$ By the WLLN, the right-hand side converges in probability, as $n \to \infty$, to $$E[f(X;\theta)]= \int f(x;\theta)f(x;\theta_0)\,dx = \left\langle f(\cdot;\theta),f(\cdot;\theta_0)\right\rangle,$$ where the expectation is taken under the true density $f(x;\theta_0)$. By the Cauchy–Schwarz inequality, this inner product is maximized when the two functions point in the same direction, i.e. when $f(x;\theta)=c\,f(x;\theta_0)$ for some constant $c>0$; since both are densities integrating to $1$, this forces $c=1$. So the limiting upper bound on $L(\theta;x_1,\dots,x_n)^{1/n}$ is largest when $f(x;\theta)=f(x;\theta_0)$, which, by the assumption that distinct $\theta$ give distinct densities, implies $\theta = \theta_0$. Hence, as $n \to \infty$, $P(L(\theta;x_1,\dots,x_n) < L(\theta_0;x_1,\dots,x_n))=1$.
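The inner-product step can also be checked numerically under an assumed model (again not fixed by the post): for the normal location family $f(x;\theta)=N(\theta,1)$ with $\theta_0=0$, the integral $\int f(x;\theta)f(x;\theta_0)\,dx$ has the closed form $\frac{1}{\sqrt{4\pi}}e^{-(\theta-\theta_0)^2/4}$, which is indeed maximized exactly at $\theta=\theta_0$. A quadrature sketch:

```python
import math

# Assumed family for illustration: f(x; theta) = N(theta, 1), theta0 = 0.
theta0 = 0.0

def inner(theta, lo=-20.0, hi=20.0, steps=4000):
    # Trapezoidal approximation of <f_theta, f_theta0> =
    # integral of f(x; theta) * f(x; theta0) dx over a wide interval.
    def f(x, t):
        return math.exp(-0.5 * (x - t) ** 2) / math.sqrt(2 * math.pi)
    h = (hi - lo) / steps
    s = 0.5 * (f(lo, theta) * f(lo, theta0) + f(hi, theta) * f(hi, theta0))
    for i in range(1, steps):
        x = lo + i * h
        s += f(x, theta) * f(x, theta0)
    return s * h

vals = {t: inner(t) for t in [-1.0, -0.5, 0.0, 0.5, 1.0]}
print(max(vals, key=vals.get))  # the maximizer is theta = theta0 = 0.0
```

Note that this family has $\|f(\cdot;\theta)\|$ constant in $\theta$, so the Cauchy–Schwarz upper bound $\|f(\cdot;\theta)\|\,\|f(\cdot;\theta_0)\|$ does not vary with $\theta$; for families where the norm does depend on $\theta$, the "maximized at $\theta=\theta_0$" step needs a separate justification.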