Implication of Law of Large Numbers


I'm reading through a proof given for the consistency of the maximum likelihood estimator (MLE) of some parameter $\theta$.

The proof begins as follows:


Consider maximising

$$\frac{1}{n}l(\theta) = \frac{1}{n}\sum_{i = 1}^n\log f(X_i|\theta)$$

As $n$ tends to infinity, the law of large numbers implies that

$$\frac{1}{n}l(\theta) \rightarrow \mathbb{E}\,\log f(X|\theta) \hspace{1cm}(1)$$


However, I'm not seeing how they make that conclusion in ${(1)}$. I tried rewriting ${(1)}$ as,

$$\frac{1}{n}l(\theta) \rightarrow \mathbb{E}\,\log f(X|\theta) =\int_{-\infty}^{\infty}\log f(x|\theta)\,f(x|\theta)\,dx$$

but I still can't make the connection. Could someone explain the convergence in $(1)$?

Best answer:

Note that $(\log f(X_i|\theta))_{i\geq 1}$ is a sequence of i.i.d. random variables with common expected value $\mathbb{E}\,\log f(X|\theta)$ (assuming this expectation exists and is finite). The result then follows directly from the statement of the strong law of large numbers (SLLN): the sample average of i.i.d. random variables with finite mean converges almost surely to that mean.
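A quick numerical sketch of this convergence (my own illustration, not part of the original answer): take $X_i \sim N(0,1)$, so $\log f(x|\theta) = -\tfrac{1}{2}\log(2\pi) - \tfrac{1}{2}x^2$. Since $\mathbb{E}[X^2] = 1$, the limit in $(1)$ is $\mathbb{E}\,\log f(X|\theta) = -\tfrac{1}{2}\log(2\pi) - \tfrac{1}{2}$, and the sample average $\tfrac{1}{n}l(\theta)$ should approach it for large $n$.

```python
import numpy as np

def log_f(x):
    """Log-density of the standard normal, log f(x | theta) with theta = (0, 1)."""
    return -0.5 * np.log(2 * np.pi) - 0.5 * x**2

# Closed-form limit: E[log f(X|theta)] = -0.5*log(2*pi) - 0.5*E[X^2], E[X^2] = 1.
expected = -0.5 * np.log(2 * np.pi) - 0.5

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)

# (1/n) * l(theta): the sample average of the i.i.d. terms log f(X_i | theta).
sample_avg = log_f(x).mean()

print(sample_avg, expected)  # the two values agree to a few decimal places
```

The only thing the SLLN needs here is that the summands $\log f(X_i|\theta)$ are i.i.d. with a finite mean; the specific density is irrelevant, so any distribution with a tractable $\mathbb{E}\,\log f(X|\theta)$ would illustrate the same point.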