$\def\e{\mathrm{e}}$Let $X_n$ be i.i.d. Gaussian random variables. Prove convergence in probability and in $p$-th mean for every $p$ for the following sequence of random variables: $$Y_{n} = \frac 1n\sum_{j=1}^n X_{j}.$$
The problem is that the exercise does not say we are working on the real line, so I do not know how to handle the Gaussian measure in general. I know I would need the expectation and variance of one of the $X_n$, but I do not even know how to start. I am lost.
Also, here is my teacher's proof:
It suffices to consider even integers $p = 2k$; the remaining cases then follow using Hölder's inequality. We note that$$ E(X^{2k}) = \left. \left( \frac{\partial}{\partial t} \right)^{2k} E(\e^{tX}) \right|_{t = 0}. $$ For $\displaystyle X \equiv \frac{1}{n} \sum\limits_{j = 1}^n X_j$ with $X_j$ being i.i.d. Gaussian r.v.'s with mean $0$ and variance $σ^2 \in (0, \infty)$, one has$$ E(\e^{tX}) = \left( E\left( \exp\left( \frac{t}{n} X_1 \right) \right) \right)^n = \left( \exp\left( \frac{t^2 σ^2}{2n^2} \right) \right)^n = \exp\left( \frac{t^2 σ^2}{2n} \right). $$ Hence, substituting $z = tσ/\sqrt{2n}$,$$ \left. \left( \frac{\partial}{\partial t} \right)^{2k} E(\e^{tX}) \right|_{t = 0} = \left. \left( \frac{\partial}{\partial t} \right)^{2k} \exp\left( \frac{t^2 σ^2}{2n} \right) \right|_{t = 0} = \left( \frac{σ^2}{2n} \right)^k \left. \left( \frac{\partial}{\partial z} \right)^{2k} \e^{z^2} \right|_{z = 0}. $$ To complete the proof it is enough to notice (by induction w.r.t. $k$) that$$ \left. \left( \frac{\partial}{\partial z} \right)^{2k} \e^{z^2} \right|_{z = 0} $$ is a finite positive constant independent of $n$, so that $E(X^{2k})$ is a positive multiple of $n^{-k}$ and tends to $0$ as $n \to \infty$.
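For what it's worth, the induction at the end can be bypassed: the value of that derivative can be read off directly from the Taylor series of $\e^{z^2}$, which makes both its positivity and its finiteness obvious:$$ \e^{z^2} = \sum_{m = 0}^\infty \frac{z^{2m}}{m!} \quad\Longrightarrow\quad \left. \left( \frac{\partial}{\partial z} \right)^{2k} \e^{z^2} \right|_{z = 0} = (2k)! \cdot \frac{1}{k!} = \frac{(2k)!}{k!} > 0, $$since the $2k$-th derivative at $0$ picks out exactly the coefficient of $z^{2k}$ in the series.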
Thank you so much for your help.
$\def\Pto{\xrightarrow{P}}$Suppose $X_1, X_2, \cdots \sim N(μ, σ^2)$. For any fixed $ε > 0$, by Chebyshev's inequality (where $D(\cdot)$ denotes the variance),$$ P\left( \left| \frac{1}{n} \sum_{k = 1}^n X_k - μ \right| \geqslant ε\right) \leqslant \frac{1}{ε^2} D\left( \frac{1}{n} \sum_{k = 1}^n X_k \right) = \frac{σ^2}{nε^2}, \quad \forall n \geqslant 1, $$ thus$$ \lim_{n \to \infty} P\left( \left| \frac{1}{n} \sum_{k = 1}^n X_k - μ \right| \geqslant ε\right) = 0. $$ Therefore, $\displaystyle \frac{1}{n} \sum_{k = 1}^n X_k \Pto μ$.
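If it helps build intuition, here is a quick numerical sanity check of the Chebyshev bound above (the values of `mu`, `sigma`, `eps`, and the sample sizes are arbitrary choices, not part of the problem):

```python
import numpy as np

# Monte Carlo check: the empirical tail probability P(|Y_n - mu| >= eps)
# shrinks as n grows and stays below the Chebyshev bound sigma^2 / (n * eps^2).
rng = np.random.default_rng(0)
mu, sigma, eps = 1.0, 2.0, 0.5
trials = 20_000

for n in (10, 100, 1000):
    samples = rng.normal(mu, sigma, size=(trials, n))
    Y = samples.mean(axis=1)                # Y_n = (1/n) * sum of the X_k
    prob = np.mean(np.abs(Y - mu) >= eps)   # empirical tail probability
    bound = sigma**2 / (n * eps**2)         # Chebyshev bound (may exceed 1)
    print(f"n={n:5d}  P~{prob:.4f}  bound={min(bound, 1.0):.4f}")
```

The empirical probabilities drop rapidly with $n$, while the Chebyshev bound only decays like $1/n$; Chebyshev is crude here, but crude is all the proof needs.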
The only way I know to prove the $L^p$-convergence for all $p \geqslant 1$ is to use the convergence theorem for backward martingales, which seems beyond the scope of this exercise. If you would like to see that proof, I can add it here.
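One remark I would add: for $p = 2$ (and hence $1 \leqslant p \leqslant 2$, via Jensen's inequality) no martingales are needed, since the second moment can be computed directly:$$ E\left| \frac{1}{n} \sum_{k = 1}^n X_k - μ \right|^2 = D\left( \frac{1}{n} \sum_{k = 1}^n X_k \right) = \frac{σ^2}{n} \longrightarrow 0. $$It is the range $p > 2$ for general i.i.d. sequences that calls for heavier machinery.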