Is this set of random variables a Hilbert space?


Consider a sequence of i.i.d. random variables $\left\{ \varepsilon_t \right\}_{t = 1}^\infty$ with $E\left( \varepsilon_t \right) = 0$ and $E\left( \varepsilon_t^2 \right) = \sigma^2 < \infty$, and denote by $\varepsilon^t = (\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_t)$ the history of the process up to and including period $t$. Let $0 < \beta < 1$. Define $P$ as the set of all $\mathbb{R}^\infty$-valued functions $x(\varepsilon) = \left\{ x_t(\varepsilon^t) \right\}_{t = 1}^\infty$ such that $\sum\limits_{t = 1}^\infty \beta^t x_t^2 \mathop< \limits^{a.s.} \infty$ and $E_{t = 0} \sum\limits_{t = 1}^\infty \beta^t x_t^2 < \infty$.

For meaning/intuition: ${x_t} = {x_t}({\varepsilon ^t})$ are decision rules that can depend only on information ${\varepsilon ^t}$ available at time $t$.

I have the following question: Is $P$ a Hilbert space with the inner product $\langle x,y\rangle = E_{t = 0}\left( \sum\limits_{t = 1}^\infty \beta^t x_t y_t \right)$ and associated norm $$\left\| x \right\| = \langle x,x\rangle^{1/2} = \left( E_{t = 0}\left( \sum\limits_{t = 1}^\infty \beta^t x_t^2 \right) \right)^{1/2}?$$

(By ${E_{t = 0}}$ I mean the expectation at t=0, before any information on the ${\varepsilon _t}$ is available. By a.s. I mean almost surely.)

I guess the tricky parts are to prove that $P$ is complete with respect to the norm just described (is it even a norm?), and perhaps that $\left\| x \right\| = 0$ implies $x \mathop= \limits^{a.s.} 0$. I will be very grateful for any suggestions or references. I am not a mathematician, so even steps that may seem elementary to you would help me.

Best answer:

Yes, it is.

Here is an outline. There are quite a few details to fill in, however, so you may have to brush up your functional analysis.

Lemma 1. If $(\Omega,\mu)$ is any measure space, and $H$ is any separable Hilbert space, let $L^2(\Omega, \mu; H)$ be the space of Borel measurable functions $f : \Omega \to H$ such that $\|f\|^2_{L^2(\Omega,\mu;H)} := \int_\Omega \|f(\omega)\|_H^2\,\mu(d\omega) < \infty$, equipped with the inner product $\langle f,g\rangle_{L^2(\Omega,\mu;H)} = \int_\Omega \langle f(\omega), g(\omega) \rangle_H \mu(d\omega)$. (As usual we identify functions which are almost everywhere equal.) This is a Hilbert space.

The only hard part is completeness, and the proof is pretty much the same as the proof that ordinary $L^2$ spaces are complete. Take a Cauchy sequence $\{f_n\}$ and pass to a subsequence $\{f_{n_k}\}$ so that $\|f_{n_k} - f_{n_{k+1}}\|_{L^2(\Omega,\mu;H)}^2 < 2^{-k}$. Use a Borel-Cantelli argument to show that $\{f_{n_k}(\omega)\}$ is a Cauchy sequence in $H$ for almost every $\omega$. By completeness of $H$, $\{f_{n_k}\}$ converges almost everywhere; call the limit $f$. Then use the triangle inequality to show that in fact $f_n \to f$ in $L^2(\Omega,\mu;H)$-norm.
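The Borel-Cantelli step can be made concrete as follows (a sketch; the threshold $2^{-k/4}$ is one convenient choice, and Borel-Cantelli here needs only that the measures sum to something finite, so it applies to a general measure space):

```latex
% Chebyshev's inequality applied to the increments:
\mu\!\left( \|f_{n_k} - f_{n_{k+1}}\|_H > 2^{-k/4} \right)
  \le 2^{k/2} \, \|f_{n_k} - f_{n_{k+1}}\|_{L^2(\Omega,\mu;H)}^2
  < 2^{k/2} \cdot 2^{-k} = 2^{-k/2}.
% The right-hand sides are summable, so by Borel--Cantelli, for almost
% every \omega we have \|f_{n_k}(\omega) - f_{n_{k+1}}(\omega)\|_H \le 2^{-k/4}
% for all large k. The tail sums are then dominated by a geometric
% series, so \{f_{n_k}(\omega)\} is Cauchy in H.
```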

Now take $H$ to be the Hilbert space of all real sequences $\{a_i\}$ with $\sum_i \beta^i a_i^2 < \infty$. (This is a Hilbert space because it is $L^2(\mathbb{N}, \nu)$ where $\nu$ is the measure on $\mathbb{N}$ such that $\nu(A) = \sum_{i \in A} \beta^i$.) Take $(\Omega, \mu)$ to be the probability space $(\Omega, \mathbb{P})$ on which your random variables $\varepsilon_t$ are defined. Then note that all your functions $x \in P$ can be viewed as elements of $L^2(\Omega,\mathbb{P};H)$. (There is a little bit of work to do to verify that they correspond to measurable functions from $\Omega$ into $H$, with respect to the Borel $\sigma$-algebra on $H$.)
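Spelling out this identification (Tonelli's theorem justifies swapping the integral and the sum, since the integrand is nonnegative):

```latex
% Inner product on H:
\langle a, b \rangle_H = \sum_{i} \beta^i a_i b_i .
% For x \in P viewed as a map \Omega \to H, Tonelli's theorem gives
\|x\|_{L^2(\Omega,\mathbb{P};H)}^2
  = \int_\Omega \sum_{t} \beta^t x_t(\omega)^2 \,\mathbb{P}(d\omega)
  = E\left( \sum_{t} \beta^t x_t^2 \right),
% which is exactly the squared norm proposed in the question, so the
% L^2(\Omega,\mathbb{P};H) inner product restricted to P agrees with
% \langle x, y \rangle.
```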

So we have $P$ identified as a linear subspace of the Hilbert space $L^2(\Omega,\mathbb{P}; H)$. We now need to show it is closed.

Here are a couple more lemmas:

Lemma 2. If $(\Omega, \mathcal{F}, \mathbb{P})$ is a probability space and $\mathcal{G} \subset \mathcal{F}$ is a sub-$\sigma$-field, consider the subspace $L^2(\Omega, \mathcal{G},\mathbb{P}) \subset L^2(\Omega,\mathbb{P})$ consisting of all those $L^2$ (real valued) random variables which are $\mathcal{G}$-measurable. It is a closed subspace.

Proof sketch. Given a sequence $X_n$ in $L^2(\Omega, \mathcal{G}, \mathbb{P})$ converging in $L^2$ to some $X \in L^2(\Omega, \mathbb{P})$, pass to a subsequence so that the convergence is almost sure. Almost sure convergence preserves measurability, so in fact $X$ is $\mathcal{G}$-measurable.

Lemma 3. (Doob-Dynkin lemma) Let $\mathcal{G}_n = \sigma(\varepsilon_1, \dots, \varepsilon_n)$. A real-valued random variable $X$ is $\mathcal{G}_n$-measurable iff there exists a Borel function $f : \mathbb{R}^n \to \mathbb{R}$ such that $X = f(\varepsilon_1, \dots, \varepsilon_n)$.

Proof sketch. One direction is immediate. For the other direction, first consider the case $X = 1_A$ where $A \in \mathcal{G}_n$. Then consider simple functions, nonnegative measurable functions, etc. This lemma can be found in most textbooks.
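For the indicator step, one way to see it is this (a sketch):

```latex
% Let T = (\varepsilon_1,\dots,\varepsilon_n) : \Omega \to \mathbb{R}^n.
% The collection \{ T^{-1}(B) : B \subseteq \mathbb{R}^n \text{ Borel} \}
% is a \sigma-field containing every generator of \mathcal{G}_n,
% hence contains all of \mathcal{G}_n. So any A \in \mathcal{G}_n has the form
A = T^{-1}(B), \qquad \text{and therefore} \qquad
1_A = 1_B \circ T = 1_B(\varepsilon_1, \dots, \varepsilon_n),
% i.e. f = 1_B works for X = 1_A.
```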

Now for each $t$, consider the map $\pi_t : L^2(\Omega,\mathbb{P};H) \to L^2(\Omega,\mathbb{P})$ defined by $\pi_t(x) = x_t$, i.e. it picks out the $t$-th coordinate of $x$. Verify that $\pi_t$ is a bounded linear operator, hence continuous. So $E_t := \pi_t^{-1}(L^2(\Omega, \mathcal{G}_t, \mathbb{P}))$ is a closed subspace of $L^2(\Omega, \mathbb{P}; H)$. This is the space of $x$ such that $x_t$ is $\mathcal{G}_t$-measurable, which by Doob-Dynkin means it is a function of $(\varepsilon_1, \dots, \varepsilon_t)$.
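The boundedness of the coordinate map is a one-line estimate, since a single term of a nonnegative series is dominated by its sum:

```latex
\|\pi_t(x)\|_{L^2(\Omega,\mathbb{P})}^2
  = E\left( x_t^2 \right)
  \le \beta^{-t} \, E\left( \sum_{s} \beta^s x_s^2 \right)
  = \beta^{-t} \, \|x\|_{L^2(\Omega,\mathbb{P};H)}^2 ,
% so \pi_t is bounded with operator norm at most \beta^{-t/2}.
```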

Finally, verify that $P = \bigcap_{t=1}^\infty E_t$. Any intersection of closed sets is closed.


Having written this, I think it is actually overkill; you could apply the "completeness of $L^2$" argument to show directly that $P$ is complete. You'll still need something like Doob-Dynkin to show that the limit $x$ of your a.s.-converging subsequence is still in $P$, i.e. that $x_t$ can still be written as a function of $(\varepsilon_1, \dots, \varepsilon_t)$. Well, I will leave this to someone else to fill in if they would like.