Let $X_1,\dots,X_n$ be i.i.d. $\mathcal{N}(0,1)$ and let $\theta \neq 0$. The likelihood ratio is $$ \mathcal{L}_{n} = \prod_{i=1}^{n}\frac{f(X_i - \theta)}{f(X_i)}. $$ Find $\mathbb{E}\mathcal{L}_n$, and determine whether $\mathcal{L}_{n}$ converges almost surely as $n \rightarrow \infty$; if so, what is the limit?
My attempt: Since $\frac{f(x-\theta)}{f(x)} = \exp\left\{\theta x - \frac{\theta^2}{2}\right\}$ for the standard normal density $f$, finding $\mathbb{E}\mathcal{L}_{n}$ is straightforward: $$ \mathbb{E}\mathcal{L}_{n}=\mathbb{E}\exp\left\{\theta\sum_{i=1}^{n}X_i - \frac{n\theta^2}{2}\right\}=\exp\left\{- \frac{n\theta^2}{2}\right\}\mathbb{E}\exp\left\{\theta \underbrace{\sum_{i=1}^{n}X_i}_{Y \sim \mathcal{N}(0,n)}\right\} = e^{-n\theta^2/2}\,e^{n\theta^2/2} = 1, $$ using the Gaussian moment generating function $\mathbb{E}e^{\theta Y} = e^{n\theta^2/2}$.
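As a sanity check, the identity $\mathbb{E}\mathcal{L}_n = 1$ can be verified by simulation. A minimal sketch assuming NumPy; the values of $n$, $\theta$, the seed, and the sample size are arbitrary choices (note that $\mathcal{L}_n$ is heavy-tailed, so a small $n$ keeps the Monte Carlo variance manageable):

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta = 5, 0.5        # arbitrary; small n keeps Var(L_n) = e^{n theta^2} - 1 modest
reps = 200_000

X = rng.standard_normal((reps, n))                    # i.i.d. N(0,1) samples
# L_n = exp(theta * sum X_i - n*theta^2/2), from the Gaussian pdf ratio
L = np.exp(theta * X.sum(axis=1) - n * theta**2 / 2)
print(L.mean())          # should be close to 1
```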
However, I'm not sure what $\mathcal{L}_{n}$ converges to. My idea is to use the SLLN, but I'm not certain it applies here. If it does, the limit should be $0$.
There is no need to calculate the expectation of $\mathcal L_n$ explicitly: with respect to the joint density in the denominator, it is always $1$: $$ \mathbb E\mathcal L_n = \iint\limits_{\mathbb R^n} \prod_{i=1}^{n}\frac{f(x_i - \theta)}{f(x_i)} \cdot \prod_{i=1}^{n} f(x_i)\,dx_1\ldots dx_n = \iint\limits_{\mathbb R^n} \prod_{i=1}^{n}f(x_i - \theta)\,dx_1\ldots dx_n =1. $$ Take the logarithm of $\mathcal L_n$: $$ \log \mathcal L_n = \sum_{i=1}^n \log \left(\frac{f(X_i-\theta)}{f(X_i)}\right) = \sum_{i=1}^n \log e^{\theta X_i-\frac{\theta^2}{2}} = \sum_{i=1}^n \left(\theta X_i-\frac{\theta^2}{2}\right) $$ The summands $Y_i=\theta X_i-\frac{\theta^2}{2}$ are independent and identically distributed with negative expected value $$ \mathbb E Y_1 =\mathbb E(\theta X_1) -\frac{\theta^2}{2} = -\frac{\theta^2}{2} <0. $$
Therefore the sum of i.i.d. summands with negative expected value tends to $-\infty$ a.s., since by the SLLN $$ \frac{\log \mathcal L_n}{n}=\frac{\sum_{i=1}^n Y_i}{n} \to -\frac{\theta^2}{2} <0 \quad \text{a.s.} $$
And therefore $\mathcal L_n=e^{\log \mathcal L_n}\to e^{-\infty}= 0$ a.s.
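This behavior is easy to watch in a simulated path. A sketch assuming NumPy, with $\theta$, the seed, and the path length chosen arbitrarily: the running average $\log\mathcal L_n/n$ settles near $-\theta^2/2$, so $\mathcal L_n$ itself collapses to $0$ (numerically it underflows long before $n = 10^5$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.5                                 # arbitrary nonzero theta
N = 100_000
X = rng.standard_normal(N)                  # one path of i.i.d. N(0,1) draws

Y = theta * X - theta**2 / 2                # log-likelihood-ratio increments Y_i
logL = np.cumsum(Y)                         # log L_n for n = 1, ..., N

for n in (100, 1_000, 10_000, N):
    print(n, logL[n - 1] / n)               # approaches -theta^2/2 = -0.125

print(np.exp(logL[-1]))                     # L_N itself: underflows to 0.0
```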
Note that the logarithm of the ratio of two pdfs, $\log\left(\frac{g(X)}{f(X)}\right)$, always has nonpositive expected value with respect to $f$ (this is minus the Kullback–Leibler divergence). For instance, in the case of an everywhere strictly positive $f$: $$ \mathbb E_f\log\left(\frac{g(X)}{f(X)}\right) = \int_{\mathbb R}\log\left(\frac{g(x)}{f(x)}\right) \cdot f(x)\,dx = \int_{\mathbb R}\log\left(\frac{g(x)}{f(x)}-1 +1\right) \cdot f(x)\,dx $$ (using $\log(1+t)\leq t$) $$ \leq \int_{\mathbb R}\left(\frac{g(x)}{f(x)}-1\right) \cdot f(x)\,dx = \int_{\mathbb R} g(x)\,dx -\int_{\mathbb R} f(x)\,dx = 0, $$ with equality possible only when $f(x)=g(x)$ a.e.
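The same inequality can be checked numerically for a concrete pair of densities. A sketch assuming NumPy; the choice of $g$ as a standard Laplace density is arbitrary, and for this pair the exact value $\mathbb E_f\log\left(\frac{g(X)}{f(X)}\right) = -\sqrt{2/\pi} - \log 2 + \tfrac12 + \tfrac12\log 2\pi \approx -0.072$ follows from $\mathbb E|X| = \sqrt{2/\pi}$ and $\mathbb E X^2 = 1$:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal(1_000_000)          # samples from f = N(0, 1)

# f: standard normal pdf, g: standard Laplace pdf (illustrative choice)
log_f = -X**2 / 2 - 0.5 * np.log(2 * np.pi)
log_g = -np.abs(X) - np.log(2)

estimate = (log_g - log_f).mean()           # Monte Carlo E_f log(g/f)
exact = -np.sqrt(2 / np.pi) - np.log(2) + 0.5 + 0.5 * np.log(2 * np.pi)
print(estimate, exact)                      # both negative, about -0.072
```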