convergence in probability of function of random variables


Let $X_1, X_2, \ldots$ be a sequence of i.i.d. random variables. Suppose that $E(|X_1|^k) < \infty$ for some $k>0$ and that $f(x)$ is a bounded continuous function on $\mathbb{R}$. Is the following true? $$ \frac 1n\sum_{i=1}^n f(X_i)X_i^k \overset{\mathbb{P}}{\longrightarrow} \mu$$

where $\mu = E\left[f(X_1)X_1^k\right]$. Thanks for any comment or suggestion.


Hint: If $(X_n)_{n \in \mathbb{N}}$ is a sequence of iid random variables and $g: \mathbb{R} \to \mathbb{R}$ a measurable function, then $(g(X_n))_{n \in \mathbb{N}}$ is a sequence of iid random variables. If $g(X_1) \in L^1$, then by the strong law of large numbers

$$\frac{1}{n} \sum_{i=1}^n g(X_i) \stackrel{\text{a.s.}}{\to} \mathbb{E}g(X_1).$$

Apply this with $g(x) = f(x)x^k$: since $f$ is bounded, $|g(X_1)| \le \sup_x|f(x)| \cdot |X_1|^k$, which is integrable by assumption, so $g(X_1) \in L^1$. Since almost-sure convergence implies convergence in probability, the claim follows.
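As a quick numerical sanity check (not part of the hint itself), one can simulate the average for a concrete choice of distribution and $f$, say $X_i \sim \text{Exponential}(1)$, $f(x) = \sin x$ (bounded and continuous), and $k = 2$, and watch the sample mean stabilize as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 2
f = np.sin  # bounded continuous function on R

# Reference value of mu = E[f(X_1) X_1^k], estimated from one very large sample
x_big = rng.exponential(size=10**7)
mu = np.mean(f(x_big) * x_big**k)

# Sample averages (1/n) sum f(X_i) X_i^k for increasing n
for n in (10**2, 10**4, 10**6):
    x = rng.exponential(size=n)
    avg = np.mean(f(x) * x**k)
    print(n, avg, abs(avg - mu))
```

The absolute deviation from the reference value shrinks as $n$ increases, as the law of large numbers predicts; convergence in probability means large deviations become increasingly unlikely, not that every sample path decreases monotonically.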