A question about tail probabilities of identically distributed variables.


Let $X_1, X_2, \dots, X_n$ be identically distributed but not necessarily independent random variables with $E[X_j]=0$. I am trying to show that $$\lim_{n \to \infty}P\left[ \max_{1\leq j \leq n} | \sigma_n^{1/2} X_j| > \epsilon \right]=\lim_{n \to \infty}P\left[ \max_{1\leq j \leq n} | X_j| >\frac{1}{\sigma_n^{1/2}} \epsilon \right] = 0, \quad (\forall \, \epsilon>0) $$ where $\sigma_n = \tfrac{\lambda}{n} (1 - \tfrac{\lambda}{n}) = \tfrac{\lambda(n - \lambda)}{n^2}$.

According to this post, I can show that $$\lim_{n \to \infty}P\left[ \max_{1\leq j \leq n} | X_j| > n \, \epsilon \right] = 0, \quad (\forall \, \epsilon>0) $$

But I'm having trouble adapting the proof to my case.
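As a numerical sanity check of the claim, here is a quick Monte Carlo sketch. It assumes, purely for illustration, that the $X_j$ are i.i.d. standard normal (the actual $X_j$ only need to be identically distributed with mean $0$ and $E[|X|^2]<\infty$, and need not be independent); the values of $\lambda$ and $\epsilon$ are also arbitrary choices.

```python
import numpy as np

# Monte Carlo sketch: estimate P[max_j |X_j| > eps / sqrt(sigma_n)]
# for growing n, assuming (for illustration only) i.i.d. N(0,1) samples.
rng = np.random.default_rng(0)
lam, eps, trials = 2.0, 1.0, 2000

probs = []
for n in [10, 100, 1000]:
    sigma_n = (lam / n) * (1 - lam / n)     # sigma_n -> 0 like lambda/n
    threshold = eps / np.sqrt(sigma_n)      # grows like sqrt(n / lambda)
    X = rng.standard_normal((trials, n))
    prob = np.mean(np.max(np.abs(X), axis=1) > threshold)
    probs.append(prob)
    print(n, prob)
```

The estimated probability should shrink toward $0$ as $n$ grows, since the threshold $\epsilon/\sqrt{\sigma_n}$ grows like $\sqrt{n/\lambda}$ while the maximum of $n$ identically distributed variables grows much more slowly.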

Update

I also know that $E[|X|^2]< \infty$ and that $\sigma_n > 0$ for every $n$.



Best answer

Observe that $\sigma_n^{-1/2}\geqslant c_\lambda \sqrt{n}$ for some constant $c_\lambda>0$ depending only on $\lambda$ (indeed, $\sigma_n \leqslant \lambda/n$, so one can take $c_\lambda = \lambda^{-1/2}$). Therefore, it suffices to prove that for each positive $\varepsilon$, $$ \lim_{n \to \infty}P\left[ \max_{1\leq j \leq n} | X_j| > \varepsilon\sqrt n \right] = 0, $$ which is equivalent to $$ \forall\varepsilon >0,\quad \lim_{n \to \infty}P\left[ \max_{1\leq j \leq n}Y_j > \varepsilon n \right] = 0, $$ where $Y_j=X_j^2$. Since $E[Y_j]=E[|X_1|^2]<\infty$, we are now exactly in the situation of the linked post.
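For completeness, here is a sketch of how the final limit follows under the assumption $E[|X_1|^2]<\infty$, along the lines of the argument in the linked post (union bound, Markov's inequality on a truncation, then dominated convergence); note that only identical distribution is used, not independence:

```latex
% Union bound, then Markov's inequality applied to Y_1 1{Y_1 > eps*n}:
\begin{align*}
P\Bigl[\max_{1\le j\le n} Y_j > \varepsilon n\Bigr]
  &\le \sum_{j=1}^{n} P[Y_j > \varepsilon n]
   = n\,P[Y_1 > \varepsilon n] \\
  &\le \frac{n}{\varepsilon n}\,
       E\bigl[Y_1 \mathbf{1}\{Y_1 > \varepsilon n\}\bigr]
   = \frac{1}{\varepsilon}\,
       E\bigl[Y_1 \mathbf{1}\{Y_1 > \varepsilon n\}\bigr]
   \xrightarrow[n\to\infty]{} 0,
\end{align*}
% where the last step uses dominated convergence, since
% E[Y_1] = E[|X_1|^2] < \infty and Y_1 1{Y_1 > eps*n} -> 0 a.s.
```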