Sum of random variables tilted by Bernoullis


Let $(X_i)_{i\in\mathbb{N}}$ be a sequence of real-valued i.i.d. random variables for which the following convergence in probability holds:

$$\frac{1}{n}\sum_{i=1}^nX_i\stackrel{n\to\infty}{\longrightarrow}c,$$

where $c\in\mathbb{R}.$

Now consider a sequence $(B_i)_{i\in\mathbb{N}}$ of i.i.d. Bernoulli random variables in $\{0,1\}$ with parameter $p$, independent of $(X_i)_{i\in\mathbb{N}}$. I expect that the following convergence in probability holds:

$$\frac{1}{n}\sum_{i=1}^nX_iB_i\stackrel{n\to\infty}{\longrightarrow}cp.$$

Any idea how to prove it?
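Before proving it, the claim is easy to check numerically. The sketch below is a Monte Carlo illustration under an assumed concrete choice (not from the question): $X_i \sim \mathrm{Exp}(1)$, so $c = \mathbb{E}(X_1) = 1$, and $p = 0.3$, so the claimed limit is $cp = 0.3$.

```python
import random

random.seed(0)

# Illustrative choice (an assumption, not part of the question):
# X_i ~ Exponential(1), so c = E[X_1] = 1, and B_i ~ Bernoulli(p) with p = 0.3.
c, p, n = 1.0, 0.3, 200_000

total = 0.0
for _ in range(n):
    x = random.expovariate(1.0)          # X_i, mean c = 1
    b = 1 if random.random() < p else 0  # B_i ~ Bernoulli(p), independent of X_i
    total += x * b

empirical = total / n
print(empirical)  # close to c*p = 0.3 for large n
```

For $n = 200{,}000$ the standard deviation of the empirical mean is on the order of $10^{-3}$, so the output should sit very close to $0.3$.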

Answer 1:

Suppose the $X_i$ have mean $\mu$ and finite variance $\sigma^2$. Since the $X_i$ are i.i.d., the sample mean $\overline{X}_n$ has mean $\mu$ and variance $\sigma^2/n$, hence $\overline{X}_n \to \mu$ in probability; since $\overline{X}_n \to c$ by assumption, $\mu = c$. The products $X_i B_i$ are i.i.d. with finite mean, so by the (weak) law of large numbers and the independence of $X_1$ and $B_1$, \begin{align*} \frac{1}{n}\sum_{i=1}^{n}X_i B_i \overset{\mathcal{P}}{\rightarrow} \mathbb{E}[X_1 B_1] = \mathbb{E}[X_1] \mathbb{E}[B_1] = cp. \end{align*}
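The factorization $\mathbb{E}[X_1 B_1] = \mathbb{E}[X_1]\,\mathbb{E}[B_1]$ used above can be verified exactly for a small discrete example (the distribution below is an arbitrary illustrative choice): under independence the joint pmf factorizes, so the expectation of the product is the product of the expectations.

```python
from itertools import product

# Illustrative discrete X (values and probabilities chosen arbitrarily)
# and an independent Bernoulli B with parameter p.
x_pmf = {1: 0.2, 2: 0.5, 3: 0.3}   # E[X] = 2.1
p = 0.4
b_pmf = {0: 1 - p, 1: p}           # E[B] = p

# By independence the joint pmf factorizes: P(X=x, B=b) = P(X=x) * P(B=b).
e_xb = sum(x * b * px * pb
           for (x, px), (b, pb) in product(x_pmf.items(), b_pmf.items()))
e_x = sum(x * px for x, px in x_pmf.items())

print(e_xb, e_x * p)  # both approximately 0.84
```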

Answer 2:

Let me start with two remarks:

  • If $X_1$ is integrable, then it follows from the strong law of large numbers that $c = \mathbb{E}(X_1)$. Since the iid random variables $X_i B_i$ are clearly also integrable, another application of the strong law of large numbers yields $$\frac{1}{n} \sum_{i=1}^n X_i B_i \to \mathbb{E}(X_1 B_1) = \mathbb{E}(X_1) \mathbb{E}(B_1) = cp \quad \text{a.s.}$$
  • If $\frac{1}{n} \sum_{i=1}^n X_i$ converges almost surely to $c$, then it follows from the converse of the strong law of large numbers that $\mathbb{E}(|X_1|)< \infty$ (see e.g. this question). Hence, we can reason as in the first remark to obtain the desired convergence.

Now let's consider the general case. There is the following characterization of the weak law of large numbers, which can be found, for instance, in Kallenberg's book (Theorem 4.4.16).

Theorem: Let $(\xi_i)_{i \geq 1}$ be a sequence of iid random variables and let $c \in \mathbb{R}$. Then $n^{-1} \sum_{i=1}^n \xi_i \to c$ in probability if, and only if, $$r \mathbb{P}(|\xi_1| > r) \xrightarrow[]{r \to \infty} 0 \quad \text{and} \quad \mathbb{E}(\xi_1 1_{\{|\xi_1| \leq r\}}) \xrightarrow[]{r \to \infty} c.$$

Let $(X_i)_i$ be a sequence of iid random variables such that

$$\frac{1}{n} \sum_{i=1}^n X_i \to c \quad \text{in probability}$$ for some $c \in \mathbb{R}$, and let $(B_i)_{i \geq 1}$ be an independent sequence of iid Bernoulli random variables with parameter $p$. Applying the theorem we get $$r \mathbb{P}(|X_1|>r) \xrightarrow[]{r \to \infty} 0 \quad \text{and} \quad \mathbb{E}(X_1 1_{\{|X_1| \leq r\}}) \xrightarrow[]{r \to \infty} c.$$ Since $B_1$ takes only the values $0$ and $1$, we find that $\xi_1 := X_1 B_1$ satisfies

$$r \mathbb{P}(|\xi_1| > r) \leq r \mathbb{P}(|X_1|>r) \xrightarrow[]{r \to \infty} 0.$$

On the other hand, we have

\begin{align*}\mathbb{E}(\xi_1 1_{\{|\xi_1| \leq r\}}) &= \mathbb{E}(X_1 1_{\{|X_1| \leq r\}} 1_{\{B_1=1\}}) \\ &= \mathbb{E}(X_1 1_{\{|X_1| \leq r\}}) \mathbb{P}(B_1=1)\\ &\xrightarrow[]{r \to \infty} cp. \end{align*}

Applying the above theorem for $\xi_i := X_i B_i$, we conclude that

$$\frac{1}{n} \sum_{i=1}^n X_i B_i \xrightarrow[]{n \to \infty} cp \quad \text{in probability}.$$