How to prove convergence in $p$-th power and in probability?


$\def\e{\mathrm{e}}$Let $X_1, X_2, \dots$ be i.i.d. Gaussian random variables. Prove convergence in probability and in $p$-th mean, for any $p$, of the following sequence of random variables: $$Y_{n} = \frac 1n\sum_{j=1}^n X_{j}.$$

The problem is that I do not know how to handle the Gaussian measure in general, since the exercise does not say we are on the real line. I know I need the expectation and variance of one of the $X_n$, but I do not even know how to start. I am lost.

Also, here is my teacher's proof:

It suffices to consider even integers $p = 2k$. The remaining cases will follow using Hölder's inequality. We note that$$ E(X^{2k}) = \left. \left( \frac{\partial}{\partial t} \right)^{2k} E(\e^{tX}) \right|_{t = 0}. $$ For $\displaystyle X \equiv \frac{1}{n} \sum\limits_{j = 1}^n X_j$ with $X_j$ being i.i.d. Gaussian r.v.'s with mean $0$ and variance $σ^2 \in (0, \infty)$, one has$$ E(\e^{tX}) = \left( E\left( \exp\left( \frac{1}{n} tX_1 \right) \right) \right)^n = \left( \exp\left( \frac{t^2 σ^2}{2n^2} \right) \right)^n = \exp\left( \frac{t^2 σ^2}{2n} \right). $$ Hence, substituting $z = tσ/\sqrt{2n}$,$$ \left. \left( \frac{\partial}{\partial t} \right)^{2k} E(\e^{tX}) \right|_{t = 0} = \left. \left( \frac{\partial}{\partial t} \right)^{2k} \exp\left( \frac{t^2 σ^2}{2n} \right) \right|_{t = 0} = \left( \frac{σ^2}{2n} \right)^k \left. \left( \frac{\partial}{\partial z} \right)^{2k} \e^{z^2} \right|_{z = 0}. $$ To complete the proof it is enough to notice (by induction w.r.t. $k$, or from the Taylor series of $\e^{z^2}$) that$$ \left. \left( \frac{\partial}{\partial z} \right)^{2k} \e^{z^2} \right|_{z = 0} = \frac{(2k)!}{k!} \in (0, \infty), $$ so that $E(X^{2k}) = \left( \frac{σ^2}{2n} \right)^k \frac{(2k)!}{k!} \to 0$ as $n \to \infty$.
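As a sanity check on the moments used here (my addition, not part of the original proof): since $Y_n \sim N(0, v)$ with $v = \sigma^2/n$, its even moments are given by the standard Gaussian formula $E[Y_n^{2k}] = v^k (2k-1)!!$, which can be compared against a Monte Carlo estimate using only the Python standard library. The parameters ($n = 10$, $k = 2$, $\sigma = 1$, the trial count, and the seed) are arbitrary choices for illustration.

```python
import math
import random

def mc_even_moment(n, k, sigma=1.0, trials=100_000, seed=0):
    """Monte Carlo estimate of E[Y_n^(2k)], where Y_n is the mean of n
    i.i.d. N(0, sigma^2) variables (so Y_n ~ N(0, sigma^2 / n))."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        y = sum(rng.gauss(0.0, sigma) for _ in range(n)) / n
        total += y ** (2 * k)
    return total / trials

def exact_even_moment(n, k, sigma=1.0):
    """E[N(0, v)^(2k)] = v^k * (2k - 1)!!  with v = sigma^2 / n."""
    v = sigma ** 2 / n
    double_fact = math.prod(range(2 * k - 1, 0, -2))  # (2k - 1)!!
    return v ** k * double_fact

n, k = 10, 2                      # fourth moment of the mean of 10 Gaussians
est = mc_even_moment(n, k)
ref = exact_even_moment(n, k)     # 3 * (1/10)^2 = 0.03
```

Note the decay $E[Y_n^{2k}] \propto n^{-k}$ in `exact_even_moment`, which is exactly why the $p$-th moments vanish as $n \to \infty$.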

Thank you so much for your help.


There are 2 best solutions below


$\def\Pto{\xrightarrow{P}}$Suppose $X_1, X_2, \cdots \sim N(μ, σ^2)$ are i.i.d. For any fixed $ε > 0$, by Chebyshev's inequality,$$ P\left( \left| \frac{1}{n} \sum_{k = 1}^n X_k - μ \right| \geqslant ε\right) \leqslant \frac{1}{ε^2} \operatorname{Var}\left( \frac{1}{n} \sum_{k = 1}^n X_k \right) = \frac{σ^2}{nε^2}, \quad \forall n \geqslant 1, $$ thus$$ \lim_{n \to \infty} P\left( \left| \frac{1}{n} \sum_{k = 1}^n X_k - μ \right| \geqslant ε\right) = 0. $$ Therefore, $\displaystyle \frac{1}{n} \sum_{k = 1}^n X_k \Pto μ$.
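To illustrate the Chebyshev bound numerically (my addition; standard-library Python, with $μ = 0$, $σ = 1$, $ε = 0.5$ and the trial count chosen arbitrarily): the empirical frequency of $\{|\bar X_n - μ| \geqslant ε\}$ stays below $σ^2/(nε^2)$ and shrinks as $n$ grows.

```python
import random

def tail_frequency(n, eps=0.5, mu=0.0, sigma=1.0, trials=5_000, seed=1):
    """Empirical estimate of P(|mean of n i.i.d. N(mu, sigma^2) - mu| >= eps)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xbar = sum(rng.gauss(mu, sigma) for _ in range(n)) / n
        if abs(xbar - mu) >= eps:
            hits += 1
    return hits / trials

for n in (10, 50, 250):
    freq = tail_frequency(n)
    bound = 1.0 / (n * 0.5 ** 2)   # sigma^2 / (n * eps^2)
    print(n, freq, bound)
```

For Gaussian means the true tail probability decays much faster than the Chebyshev bound $1/(n ε^2)$; the bound is crude but sufficient for convergence in probability.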


The only way I know to prove the $L^p$-convergence for $p \geqslant 1$ is to use the convergence theorem for backward martingales, which seems to be out of the question. If you want to see this proof, I can add it here.


Above, $Y_n=\frac 1n(X_1+\dots+X_n)$; the sum stops at $n$. As in the accepted computation, assume the $X_j$ have mean $0$ and variance $\sigma^2$.

Let $p\geq 2$ be a positive even integer.

This introduces $k$ with $p=2k$, $k\ge 1 $ a natural number. Then we have: $$ \begin{aligned} \|Y_n-0\|_p^p &= \Bbb E[\ Y_n^p\ ] \\ &= \Bbb E[\ Y_n^p\exp(tY_n)\ ]\text{ computed in }t=0 \\ &= \Bbb E[\ \partial_t^p\exp(tY_n)\ ]\text{ computed in }t=0 \\ &= \partial_t^p\ \Bbb E[\ \exp(tY_n)\ ]\text{ computed in }t=0 \\ &= \partial_t^p\ \Bbb E\left[\ \exp\left(\frac tn\sum_{1\le j\le n}X_j\right)\ \right]\text{ computed in }t=0 \\ &= \partial_t^p\ \Bbb E\left[\ \prod_{1\le j\le n}\exp\left(\frac tnX_j\right)\ \right]\text{ computed in }t=0 \\ &= \partial_t^p\ \prod_{1\le j\le n}\Bbb E\left[\ \exp\left(\frac tnX_j\right)\ \right]\text{ computed in }t=0 \\ &= \partial_t^p\ \prod_{1\le j\le n}\exp\left(\frac {t^2}{n^2}\sigma^2\right)\text{ computed in }t=0 \\ &= \partial_t^p\ \exp\left(n\cdot \frac {t^2}{n^2}\sigma^2\right)\text{ computed in }t=0 \\ &= \partial_t^p\ \exp\left(\frac {t^2}n\sigma^2\right)\text{ computed in }t=0 \\ &= \left(\frac {\sigma^2}n\right)^p\ \partial_t^p\ \exp\left(\; t^2\; \right)\text{ computed in }t=0 \\ &= \left(\frac {\sigma^2}n\right)^{2k}\ \partial_t^{2k}\ \exp\left(\; t^2\; \right)\text{ computed in }t=0 \\ &= \left(\frac {\sigma^2}n\right)^{2k}\ \partial_t^{2k}\ \left(\ \frac 1{0!}+\frac 1{1!}t^2+\dots+\frac 1{k!}t^{2k}+\dots \; \right)\text{ computed in }t=0 \\ &= \left(\frac {\sigma^2}n\right)^{2k}\ \left(\ 0+0+\dots+\frac 1{k!}(2k)!t^0+\dots \; \right)\text{ computed in }t=0 \\ &= \left(\frac {\sigma^2}n\right)^{2k}\ \frac {(2k)!}{k!} \\ &= \left(\frac {\sigma^2}n\right)^p\ \frac {p!}{(p/2)!}\ . \\\\ \text{ So we get:}& \\\\ \|Y_n-0\|_p^p &= \frac {\sigma^2}n\cdot\left(\frac {p!}{(p/2)!}\right)^{1/p} \\ &\to 0\text{ for }n\to\infty\ . \end{aligned} $$