Under what conditions do expected value and linear operator commute?


Let $X \in L^2[0,1]$ be a square-integrable random function, and let $(X_n)$ be a sequence of iid realizations of $X$ with $E[X_n] = 0$ and $E[X_n^4] < \infty$.
Let $\Psi: L^2[0,1] \rightarrow L^2[0,1]$ be some linear operator.

I want to show that $$ E\big[\Psi ( \langle X_n, x \rangle X_n)\big] = \Psi \big( E [\langle X_n, x \rangle X_n ] \big). $$ Here $x$ is an arbitrary function in $L^2[0,1]$ and $\langle a,b\rangle = \int a(t) b(t)\, dt$ is the inner product.

Showing the above is no problem if I additionally assume $\Psi$ to be an integral operator of the form $\Psi(X) = \int \psi(s,t) X(s)\, ds$, but I would like to avoid this specific assumption.
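As a quick finite-dimensional sanity check of the integral-operator case (a sketch, not part of the proof: the grid size, the kernel $\psi$, and the simple distribution of $X$ below are all arbitrary choices), one can discretize $[0,1]$ so that functions become vectors and the integral operator becomes a matrix-vector product:

```python
import numpy as np

# Discretize [0,1] on a grid: functions become vectors, and the integral
# operator Psi(X)(t) = ∫ psi(s,t) X(s) ds becomes a weighted matrix product.
rng = np.random.default_rng(0)
d = 50
t = np.linspace(0, 1, d)
dt = 1.0 / d
psi = np.exp(-(t[:, None] - t[None, :]) ** 2)  # an arbitrary smooth kernel

def Psi(v):
    # (Psi v)(t) ≈ sum_s psi(s, t) v(s) dt
    return psi.T @ v * dt

# Let X take finitely many values v_k with probabilities p_k, so the
# expectation is an exact finite sum.
vals = rng.standard_normal((4, d))
p = np.array([0.1, 0.2, 0.3, 0.4])

lhs = sum(pk * Psi(vk) for pk, vk in zip(p, vals))  # E[Psi(X)]
rhs = Psi(sum(pk * vk for pk, vk in zip(p, vals)))  # Psi(E[X])
print(np.allclose(lhs, rhs))  # True, by linearity
```

For a simple (finitely-valued) $X$ the equality is just linearity of the finite sum, which is exactly why the general question reduces to an approximation argument.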

Can I simply put the expected value inside the expression on which I apply the linear operator? Or are additional assumptions necessary?

Thanks for your help!


Working on the assumption that you are using $X$ and $X_n$ interchangeably, the answer follows from a simple approximation argument, provided that $\Psi$ is *bounded*: continuity of $\Psi$ is what lets the expectation pass through, and for unbounded operators additional care is needed.

Consider a random variable $X$ on a separable Hilbert space $\mathcal{H}$. (We can set $\mathcal{H} = L^2[0,1]$ in OP's case.) Also, denote by $\| \cdot\|$ the induced norm on $\mathcal{H}$.

  1. Assume first that $X$ is simple. That is, $X$ is of the form $\sum_{k=1}^{n} x_k \mathbf{1}_{E_k}$ for some $x_1, \cdots, x_n \in \mathcal{H}$ and disjoint measurable sets $E_1, \cdots, E_n$. Then for any bounded linear operator $\Psi : \mathcal{H} \to \mathcal{H}$,

    $$ \mathbb{E} \Psi(X) = \mathbb{E} \left[ \sum_{k=1}^{n} \Psi(x_k) \mathbf{1}_{E_k} \right] = \sum_{k=1}^{n} \Psi(x_k) \mathbb{P}(E_k) = \Psi\left[ \sum_{k=1}^{n} x_k \mathbb{P}(E_k) \right] = \Psi(\mathbb{E} X). $$

  2. For a general integrable random variable $X$ on $\mathcal{H}$ (i.e. $\mathbb{E}\|X\| < \infty$), separability of $\mathcal{H}$ allows us to construct a sequence $(X_n)$ of simple random variables on $\mathcal{H}$ such that $X_n \to X$ almost surely in $\mathcal{H}$ and $\|X_n\| \leq \|X\|$ almost surely. Then we also have $\Psi(X_n) \to \Psi(X)$ and $\|\Psi(X_n)\| \leq \|\Psi\|\|X\|$ almost surely. So we can apply the dominated convergence theorem to obtain

    $$ \mathbb{E}\Psi(X) \stackrel{\text{DCT}}{=} \lim_{n\to\infty} \mathbb{E}\Psi(X_n) = \lim_{n\to\infty} \Psi(\mathbb{E}X_n) \stackrel{\text{DCT}}{=} \Psi(\mathbb{E}X). $$

  3. Now, if we assume that $X$ is square-integrable in the sense that $\mathbb{E}[\|X\|^2] < \infty$, then for each $x \in \mathcal{H}$ the random variable $\langle X, x\rangle X$ is integrable, since $\|\langle X, x\rangle X\| \leq \|x\|\,\|X\|^2$ by Cauchy–Schwarz. So by step 2 we have

    $$ \mathbb{E}[\Psi(\langle X, x \rangle X)] = \Psi(\mathbb{E}[\langle X, x\rangle X]). $$
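The conclusion of step 3 can be sketched numerically in $\mathbb{R}^d$ (standing in for a separable Hilbert space). Everything below is an arbitrary illustrative choice: $\Psi$ is a generic matrix (a bounded linear operator), and $X$ is a simple random element taking five values with equal probability, so both sides are exact finite sums:

```python
import numpy as np

# Finite-dimensional sketch of E[Psi(<X, x> X)] = Psi(E[<X, x> X]).
rng = np.random.default_rng(1)
d = 30
A = rng.standard_normal((d, d))    # a bounded linear operator Psi
x = rng.standard_normal(d)         # a fixed element of the space
vals = rng.standard_normal((5, d)) # the outcomes v_1, ..., v_5 of X
p = np.full(5, 0.2)                # their probabilities

def inner(a, b):
    return a @ b                   # Euclidean inner product

# E[Psi(<X, x> X)]: apply Psi outcome by outcome, then average
lhs = sum(pk * (A @ (inner(vk, x) * vk)) for pk, vk in zip(p, vals))
# Psi(E[<X, x> X]): average first, then apply Psi
rhs = A @ sum(pk * inner(vk, x) * vk for pk, vk in zip(p, vals))
print(np.allclose(lhs, rhs))  # True
```

For simple $X$ the agreement is exact (up to floating point); the approximation argument above is what carries it over to general square-integrable $X$.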

Not surprisingly, a much more general statement continues to hold: bounded linear operators always commute with the Bochner integral, and Hille's theorem extends this to closed operators under suitable conditions.