How to show $\sup_{\theta\in(0,1)}|\log(\theta)(\bar{X}_n-\mathbb{E}[X_i])|\xrightarrow[]{p}0$?


Question: Suppose $X_1,\ldots,X_n$ are i.i.d. with finite mean and $\bar{X}_n=\frac{1}{n}\sum_{i=1}^nX_i$. I want to show $$\sup_{\theta\in(0,1)}|\log(\theta)(\bar{X}_n-\mathbb{E}[X_i])|\xrightarrow[]{p}0.$$

My thoughts:

  1. By definition I need to show $$\lim_{n\to\infty}\Pr\left(\sup_{\theta\in(0,1)}|\log(\theta)(\bar{X}_n-\mathbb{E}[X_i])|>\varepsilon\right)=\lim_{n\to\infty}\Pr\left(\sup_{\theta\in(0,1)}|\log(\theta)|\cdot|\bar{X}_n-\mathbb{E}[X_i]|>\varepsilon\right)=0$$ for all $\varepsilon>0$;
  2. By the WLLN, $|\bar{X}_n-\mathbb{E}[X_i]|\xrightarrow[]{p}0$, so the claim would follow if I could conclude $$\lim_{n\to\infty}\Pr\left(|\bar{X}_n-\mathbb{E}[X_i]|>\frac{\varepsilon}{\sup_{\theta\in(0,1)}|\log(\theta)|}\right)=0$$ for all $\varepsilon>0$.

The problem is that $\sup_{\theta\in(0,1)}|\log(\theta)|=\infty$, since $\log(\theta)\to-\infty$ as $\theta\to 0^+$, so the bound above breaks down. I wonder whether there is a way to show the uniform convergence in question, or whether there is no uniform convergence on $(0,1)$ in the first place.
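To convince myself, here is a quick numerical sketch (Python; `scipy.stats.logser` samples the logarithmic series distribution from the context below, and $\theta_0=0.5$, the seed, and the grid of $\theta$ values are arbitrary choices for illustration). For fixed $n$ the gap $|\bar{X}_n-\mathbb{E}[X_i]|$ is typically nonzero, and multiplying it by $|\log(\theta)|$ blows up as $\theta\to 0^+$:

```python
import numpy as np
from scipy.stats import logser

rng = np.random.default_rng(0)
theta0 = 0.5  # hypothetical true parameter, chosen only for illustration

# E[X] for the logarithmic series distribution: -theta / ((1-theta) log(1-theta))
mean_X = -theta0 / ((1 - theta0) * np.log(1 - theta0))

n = 10_000
x = logser.rvs(theta0, size=n, random_state=rng)
gap = abs(x.mean() - mean_X)  # |Xbar_n - E[X_i]|: small for large n, but nonzero

# |log(theta)| is unbounded as theta -> 0+, so for fixed n the product
# |log(theta)| * |Xbar_n - E[X_i]| can be made as large as we like.
for theta in [1e-1, 1e-5, 1e-20, 1e-100]:
    print(f"theta = {theta:.0e}: {abs(np.log(theta)) * gap:.4f}")
```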

Some additional context: I'm trying to show that the maximum-likelihood estimator for a logarithmic series distribution is consistent. In particular, the $X_i$ are i.i.d. with pmf $$f(x;\theta)=\frac{-\theta^x}{x\log(1-\theta)}\quad\text{for}\quad x=1,2,\ldots$$ Since the MLE has no closed form, I'm reformulating it as an M-estimator with criterion function (the negative log-likelihood per observation, after dropping the $\theta$-free term $\log x$) $$S_n(\theta)=\frac{1}{n}\sum_{i=1}^n\left(\log(-\log(1-\theta))-X_i\log(\theta)\right)=\log(-\log(1-\theta))-\bar{X}_n\log(\theta).$$ To establish consistency I need $S_n(\theta)$ to converge in probability to $$S(\theta)=\mathbb{E}\left[\log(-\log(1-\theta))-X_i\log(\theta)\right]=\log(-\log(1-\theta))-\mathbb{E}[X_i]\log(\theta)$$ uniformly on $(0,1)$, hence the question. I have tried to appeal to the uniform law of large numbers, but I could not find an integrable envelope dominating $|\log(-\log(1-\theta))-X_i\log(\theta)|$ uniformly in $\theta\in(0,1)$.
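For completeness, here is a minimal sketch of the M-estimation step itself (Python again; `scipy.stats.logser`, the hypothetical true value $\theta_0=0.5$, and the interval endpoints are illustrative assumptions, not part of the question). It minimizes $S_n(\theta)$ over a closed subinterval of $(0,1)$, and the minimizer does appear to approach $\theta_0$ as $n$ grows, which is the consistency I am trying to prove:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import logser

rng = np.random.default_rng(1)
theta0 = 0.5  # hypothetical true parameter, for illustration only

for n in [100, 1_000, 10_000]:
    xbar = logser.rvs(theta0, size=n, random_state=rng).mean()

    # Criterion function from the question:
    # S_n(theta) = log(-log(1 - theta)) - Xbar_n * log(theta)
    def S_n(theta):
        return np.log(-np.log(1 - theta)) - xbar * np.log(theta)

    # Minimize on a closed subinterval of (0, 1); the criterion blows up
    # at both endpoints, so the minimizer is interior.
    res = minimize_scalar(S_n, bounds=(1e-10, 1 - 1e-10), method="bounded")
    print(f"n = {n:>6}: theta_hat = {res.x:.4f}")
```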