I'm working on the space of square-integrable random variables ($L^2(\Omega, F, P)$). I need to prove that it is a Hilbert space (the exercise doesn't say with which norm). The usual inner product is $\langle X, Y\rangle = \mathbb{E}[XY]$, and it satisfies the necessary properties. So I decided to consider the norm $\lVert X\rVert_2=\sqrt{\mathbb{E}[X^2]}$.
So I know that I need to prove that this space is complete with respect to that norm, and that's what I can't seem to do. I consider a Cauchy sequence $\{X_n\}$ in $L^2(\Omega, F, P)$: for every $\varepsilon>0$, there is an $N$ such that for all $n,m>N$ we have $\lVert X_n-X_m\rVert_2=\sqrt {\mathbb{E}[(X_n-X_m)^2]}\leq \varepsilon$.
I need to prove that there is an $X$ in $L^2(\Omega, F, P)$ such that $\lim_{n\to\infty}\lVert X_n-X\rVert_2=0$.
And from there I don't really know where to go. I saw things about the dominated convergence theorem, but I'm unsure how to use it in this situation. I hope my question is not too dumb.
First of all, the correct inner product depends on whether you're considering real or complex random variables. If you're looking at complex RV's, then the inner product you want is $\left< X, Y \right> : = \mathbf{E} \left[ X \overline{Y} \right]$, but if you're looking at real RV's, then $\mathbf{E}[XY]$ is fine. The norm is then $\left\| X \right\| : = \sqrt{\left< X, X \right>} = \mathbf{E} \left[ |X|^2 \right]^{1/2}$. Either way, the distinction won't make a difference in what follows. I should also note that what I'm about to write out can be found in basically any measure theory text, since any introductory measure theory text will prove that $L^p$ spaces are complete.
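(As a quick sanity check that this really is a norm — a step the exercise may or may not ask for — the triangle inequality follows from the Cauchy–Schwarz inequality $|\langle X, Y \rangle| \leq \|X\| \, \|Y\|$; in the real case, for instance:)

```latex
\begin{align*}
\|X + Y\|^2 &= \mathbf{E}\left[ (X + Y)^2 \right]
             = \|X\|^2 + 2\,\mathbf{E}[XY] + \|Y\|^2 \\
            &\leq \|X\|^2 + 2\,\|X\|\,\|Y\| + \|Y\|^2
             = \left( \|X\| + \|Y\| \right)^2 .
\end{align*}
```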
Let $(X_n)_{n = 1}^\infty$ be a Cauchy sequence in $L^2$. I'm going to prove that an appropriately chosen subsequence of $(X_n)_{n = 1}^\infty$ converges in $L^2$. This will then imply that $(X_n)_{n = 1}^\infty$ is convergent. I'll prove this at the end, but you can also take it as a worthwhile exercise.
Because $(X_n)_{n = 1}^\infty$ is Cauchy, for each $k \in \mathbb{N}$, there exists $n_k \in \mathbb{N}$ such that if $n \geq n_k$, then $\|X_n - X_{n_k}\| \leq 4^{-k}$. We can assume without loss of generality that $n_1 < n_2 < n_3 < \cdots$. In particular, $\|X_{n_{k + 1}} - X_{n_k}\| \leq 4^{-k}$. I'm going to prove that $(X_{n_k})_{k = 1}^\infty$ converges in $L^2$. In fact, I'm going to start by proving a slightly different claim: that $(X_{n_k})_{k = 1}^\infty$ converges pointwise almost everywhere, which is what will let me bring in Fatou's Lemma. For convenience, let's write $Y_k = X_{n_{k + 1}} - X_{n_k}$.
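(If the "without loss of generality" bothers you, here is one way to make the choice of indices explicit, choosing them recursively so that they are strictly increasing:)

```latex
n_1 := \min\left\{ N : \|X_n - X_m\| \leq 4^{-1} \text{ for all } n, m \geq N \right\} , \qquad
n_{k+1} := \max\left\{ n_k + 1 ,\; \min\left\{ N : \|X_n - X_m\| \leq 4^{-(k+1)} \text{ for all } n, m \geq N \right\} \right\} .
```

Since $n_k$ is at least the minimal threshold at level $k$, any $n \geq n_k$ satisfies $\|X_n - X_{n_k}\| \leq 4^{-k}$, exactly as above.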
For each $k \in \mathbb{N}$, set $$E_k = \left\{ \omega : |Y_k(\omega)| \geq 2^{-k} \right\} = \left\{ \omega : |Y_k(\omega)|^2 \geq 4^{-k} \right\} .$$ By the Chebyshev Inequality, we have that $P(E_k) \leq 4^k \mathbf{E} \left[|Y_k|^2 \right] = 4^k \|Y_k\|^2 \leq 4^k / 16^k = 4^{-k}$. Therefore $\sum_{k = 1}^\infty P(E_k) < \infty$. By the Borel-Cantelli Lemma, it follows that $$P \left( \left\{ \omega \in \Omega : |Y_k(\omega)| \geq 2^{-k} \textrm{ for infinitely many $k \in \mathbb{N}$} \right\} \right) = 0 .$$ This means that for almost all $\omega \in \Omega$, we have that $|Y_k(\omega)| < 2^{-k}$ for sufficiently large $k \in \mathbb{N}$. Write $$F = \left\{ \omega \in \Omega : \exists K \in \mathbb{N} \; \forall k \geq K \; \left( \left| Y_k(\omega) \right| < 2^{-k} \right) \right\} .$$ If $\omega \in F$ (which is almost all $\omega \in \Omega$), then $$X_{n_k}(\omega) = X_{n_1}(\omega) + \sum_{j = 1}^{k - 1} Y_j(\omega) ,$$ and since $|Y_j(\omega)| < 2^{-j}$ for all sufficiently large $j$, the series $\sum_{j = 1}^{\infty} Y_j(\omega)$ converges absolutely by comparison with $\sum_{j = 1}^{\infty} 2^{-j}$. Therefore $(X_{n_k}(\omega))_{k = 1}^\infty$ converges for all $\omega \in F$, i.e. almost everywhere.
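(For reference, the Chebyshev step above is just Markov's inequality applied to the nonnegative random variable $|Y_k|^2$ with threshold $4^{-k}$, combined with the bound $\|Y_k\| \leq 4^{-k}$:)

```latex
P(E_k) = P\left( |Y_k|^2 \geq 4^{-k} \right)
  \leq \frac{\mathbf{E}\left[ |Y_k|^2 \right]}{4^{-k}}
  = 4^k \, \|Y_k\|^2
  \leq 4^k \cdot 16^{-k}
  = 4^{-k} .
```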
Now write $Z(\omega) = \lim_{k \to \infty} X_{n_k}(\omega)$ for all $\omega \in F$. Since $\Omega \setminus F$ has probability $0$, we can set $Z$ to whatever we want there, say $Z = 0$. Fatou's Lemma then tells us that \begin{align*} \mathbf{E} \left[ |Z|^2 \right]^{1/2} & \leq \liminf_{k \to \infty} \mathbf{E} \left[ |X_{n_k}|^2 \right]^{1/2} \\ & = \liminf_{k \to \infty} \left( \mathbf{E} \left[ \left| X_{n_1} + \sum_{j = 1}^{k - 1} Y_j \right|^2 \right]^{1/2} \right) \\ & = \liminf_{k \to \infty}\left( \left\| X_{n_1} + \sum_{j = 1}^{k - 1} Y_j \right\| \right)\\ & \leq \liminf_{k \to \infty} \left( \|X_{n_1}\| + \sum_{j = 1}^{k - 1} \| Y_j \| \right) \\ & \leq \liminf_{k \to \infty} \left( \|X_{n_1}\| + \sum_{j = 1}^{k - 1} 4^{-j} \right) \\ & = \|X_{n_1}\| + \sum_{j = 1}^{\infty} 4^{-j} \\ & < \infty . \end{align*} Therefore $Z \in L^2$. To see that $X_{n_k} \to Z$ in $L^2$ (and not merely almost everywhere), apply Fatou's Lemma once more: for each fixed $k$ and all $m > k$, the triangle inequality gives $\|X_{n_m} - X_{n_k}\| \leq \sum_{j = k}^{m - 1} \|Y_j\| \leq \sum_{j = k}^{\infty} 4^{-j}$, so $$\mathbf{E} \left[ |Z - X_{n_k}|^2 \right] \leq \liminf_{m \to \infty} \mathbf{E} \left[ |X_{n_m} - X_{n_k}|^2 \right] \leq \left( \sum_{j = k}^{\infty} 4^{-j} \right)^2 = \left( \tfrac{4}{3} \cdot 4^{-k} \right)^2 \xrightarrow[k \to \infty]{} 0 .$$ Finally, to prove that $Z$ is also the limit of the full sequence $(X_n)_{n = 1}^\infty$, let $\epsilon > 0$. Choose $N \in \mathbb{N}$ such that if $n \geq N$, then $\|X_n - X_N\| \leq \epsilon / 3$. Since $X_{n_k} \to Z$ in $L^2$, there exists $K \in \mathbb{N}$ such that if $k \geq K$, then $\|X_{n_k} - Z\| \leq \epsilon / 3$. Assume without loss of generality that $n_K \geq N$. Then for all $n \geq n_K$, we have that $\|Z - X_n\| \leq \|Z - X_{n_K} \| + \|X_{n_K} - X_N\| + \|X_N - X_n\| \leq \epsilon$, where the estimates on $\|X_{n_K} - X_N\|$ and $\|X_N - X_n\|$ follow from the fact that $n_K, n \geq N$.