Central limit theorem and Slutsky


Let $(A_i)$ be a sequence of positive i.i.d. random variables with $0 < E[A_1^2] < \infty$.

Let $(B_i)$ be a sequence of i.i.d. random variables, independent of $(A_i)$, with $0 < \operatorname{Var}[B_1] < \infty$.

Then $$X_n \xrightarrow{d} N\left(0,\frac{\operatorname{Var}[B_1]\,E[A_1^2]}{\left(E[A_1]\right)^2}\right) \qquad \text{as } n \to \infty,$$ where $$X_n = \sqrt{n}\left(\frac{\sum_{i=1}^{n}A_iB_i}{\sum_{i=1}^{n}A_i}-E[B_1]\right).$$ The central limit theorem and the strong law of large numbers seem like the right tools here, but I can't separate the expression into independent parts to which I can apply each theorem and then combine the results via Slutsky's theorem.

Can someone help?




You can rearrange the expression for $X_n$ by multiplying and dividing by $\sqrt{n}$. Since $\sum_{k=1}^{n} A_k B_k - E[B_1]\sum_{k=1}^{n} A_k = \sum_{k=1}^{n} A_k\left(B_k - E[B_1]\right)$, this gives

$$ X_n = \frac{n}{\sum_{k=1}^nA_k} \cdot \frac{\sum_{k=1}^nA_k\left(B_k-E[B_1]\right)}{\sqrt{n}}.$$

Observe that you can apply the CLT to $\frac{\sum_{k=1}^nA_k\left(B_k-E[B_1]\right)}{\sqrt{n}}$: the sequence $\left(A_k\left(B_k -E[B_1]\right)\right)_k$ is i.i.d., and each term has mean zero.
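To spell out the moment computations the CLT needs, using the independence of $A_k$ and $B_k$:

$$E\!\left[A_k\left(B_k - E[B_1]\right)\right] = E[A_k]\,E\!\left[B_k - E[B_1]\right] = E[A_1] \cdot 0 = 0,$$

$$\operatorname{Var}\!\left[A_k\left(B_k - E[B_1]\right)\right] = E\!\left[A_k^2\left(B_k - E[B_1]\right)^2\right] = E[A_1^2]\,\operatorname{Var}[B_1].$$

Hence the CLT gives $\frac{1}{\sqrt{n}}\sum_{k=1}^n A_k\left(B_k - E[B_1]\right) \xrightarrow{d} N\!\left(0,\, E[A_1^2]\operatorname{Var}[B_1]\right)$.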

By the strong law of large numbers, $\frac{\sum_{k=1}^nA_k}{n} \rightarrow E[A_1]$ a.s., hence also in probability, and therefore $\frac{n}{\sum_{k=1}^nA_k} \rightarrow \frac{1}{E[A_1]}$ in probability. Since the $A_i$ are positive and $E[A_1^2] > 0$, we have $E[A_1] > 0$, so this limit is well defined and Slutsky's theorem applies: multiplying the limiting normal by the constant $\frac{1}{E[A_1]}$ scales its variance by $\frac{1}{\left(E[A_1]\right)^2}$, which is exactly the stated limit distribution.
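As a quick numerical sanity check of the claimed limit (illustrative only; the distributions for $A_i$ and $B_i$ below are my choice, not part of the problem), take $A_i \sim \text{Exponential}(1)$, so $E[A_1] = 1$ and $E[A_1^2] = 2$, and $B_i \sim N(0,1)$, so $E[B_1] = 0$ and $\operatorname{Var}[B_1] = 1$. The limiting variance should then be $\operatorname{Var}[B_1]\,E[A_1^2]/(E[A_1])^2 = 2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 2000, 20000  # sample size per replication, number of replications

A = rng.exponential(1.0, size=(reps, n))  # positive i.i.d., E[A]=1, E[A^2]=2
B = rng.standard_normal((reps, n))        # i.i.d., independent of A, Var[B]=1

# X_n = sqrt(n) * (sum A_i B_i / sum A_i - E[B_1]); here E[B_1] = 0
X = np.sqrt(n) * (A * B).sum(axis=1) / A.sum(axis=1)

print(X.mean())  # should be near 0
print(X.var())   # should be near 2
```

The empirical variance of $X_n$ should be close to 2 for large $n$, matching the formula.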