Log-likelihood statistic convergence.


If $f$ and $g$ are PDFs, $D(f\|g)<\infty$, and $X_1,X_2,\dots$ are i.i.d. random variables, is it possible to show that the log-likelihood statistic $$\Lambda_n = \log \frac{f(X_1)\cdots f(X_n)}{g(X_1)\cdots g(X_n)}$$ converges to some constant $c$ as $n \rightarrow \infty$? Here $D$ denotes the Kullback–Leibler divergence.


Suppose $h(x)$ is the PDF of the $X_i$. Then by the strong law of large numbers, $$\frac{1}{n}\Lambda_n \xrightarrow{\text{a.s.}} \mathbb{E}_{h}\left[\log\frac{f(X)}{g(X)}\right] = \mathbb{E}_{h}\left[\log\frac{h(X)}{g(X)}\right]-\mathbb{E}_{h}\left[\log\frac{h(X)}{f(X)}\right] = D(h\|g) - D(h\|f).$$ In particular, if $h = f$ the limit is $D(f\|g)$. Note that it is $\Lambda_n/n$, not $\Lambda_n$ itself, that converges to a constant; $\Lambda_n$ grows linearly whenever that limit is nonzero.
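A quick Monte Carlo sketch of this limit, under the hypothetical choice $h = f = N(0,1)$ and $g = N(1,1)$ (these densities are my own illustrative assumption, not from the question): the limit is then $D(f\|g) = (\mu_f-\mu_g)^2/2 = 0.5$, since $D(f\|f)=0$.

```python
import random

# Assumed example densities: f = N(0,1), g = N(1,1), and X_i drawn from h = f.
# Then log f(x)/g(x) = -x^2/2 + (x-1)^2/2 = 0.5 - x, and the LLN gives
# (1/n) * Lambda_n -> E_f[0.5 - X] = 0.5 = D(f||g).

def log_ratio(x):
    """log f(x)/g(x) for f = N(0,1), g = N(1,1); the normalizers cancel."""
    log_f = -x * x / 2.0
    log_g = -(x - 1.0) ** 2 / 2.0
    return log_f - log_g

random.seed(0)
n = 200_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]  # X_i ~ h = f
avg = sum(log_ratio(x) for x in samples) / n          # (1/n) * Lambda_n
print(avg)  # close to D(f||g) = 0.5
```

The running average settles near $0.5$, while the unnormalized sum $\Lambda_n \approx 0.5\,n$ diverges, matching the remark above.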