Inner product and norms for random vectors


From the Wikipedia page on inner products: the expected value of the product of two random variables defines an inner product, $\langle X,Y \rangle = \operatorname{E}(X Y)$. How can this be generalized to random vectors?
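One natural generalization (an assumed convention, not stated in the question) is $\langle X, Y \rangle = \operatorname{E}(X^{\top} Y) = \sum_i \operatorname{E}(X_i Y_i)$ for random vectors $X, Y$ in $\mathbb{R}^d$. A minimal Monte Carlo sketch of this, with made-up sample data:

```python
import numpy as np

# Sketch under the assumption <X, Y> := E[X^T Y] = sum_i E[X_i Y_i]
# for random vectors X, Y in R^d.
rng = np.random.default_rng(0)
n, d = 100_000, 3
X = rng.normal(size=(n, d))          # samples of X ~ N(0, I_3)
Y = X + rng.normal(size=(n, d))      # Y correlated with X

# Monte Carlo estimate of E[X^T Y]; here E[X^T Y] = E[|X|^2] = d = 3.
inner = np.mean(np.sum(X * Y, axis=1))

# Induced norm sqrt(E[|X|^2]) and a Cauchy-Schwarz sanity check.
norm_X = np.sqrt(np.mean(np.sum(X * X, axis=1)))
norm_Y = np.sqrt(np.mean(np.sum(Y * Y, axis=1)))
assert abs(inner) <= norm_X * norm_Y
```

This is just the ordinary $L^2$ inner product applied coordinate-wise and summed, so all the inner-product axioms follow from the scalar case (modulo the usual identification of random vectors that agree almost surely).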

Or, more generally, for arbitrary probability measures: let $\mathbb{P}$ be the set of all probability measures defined on $X$, and let $\mathbb{M}$ be the linear span of $\mathbb{P} - \mathbb{P}$ (differences of probability measures). How can an inner product be defined on $\mathbb{M} \times \mathbb{M}$?

I've looked at norms of the form $$\|P - Q\|= \sup_{f} \left| \int f \, dP - \int f \, dQ \right|,$$ with the supremum taken over some class of functions $f$. But it seems that this norm doesn't satisfy the parallelogram law, so the polarization identity $\langle x, y\rangle = \frac{1}{4}( \|x + y\|^{2} - \|x - y\|^{2})$ cannot be used to recover an inner product from it. Is it possible to prove this?
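Yes, and a finite counterexample suffices. Assuming the supremum is taken over $|f| \le 1$ (in which case the norm is the total-variation norm, and on a finite set $\|\mu\| = \sum_i |\mu_i|$), the parallelogram law already fails for differences of point masses:

```python
import numpy as np

# Assumption: sup over |f| <= 1, so the norm is total variation;
# on a finite set, ||mu|| = sum_i |mu_i|.
# Take x = delta_0 - delta_1 and y = delta_0 - delta_2, both differences
# of point masses on {0, 1, 2}, hence elements of span(P - P).
x = np.array([1.0, -1.0, 0.0])
y = np.array([1.0, 0.0, -1.0])

tv = lambda mu: np.abs(mu).sum()

lhs = tv(x + y) ** 2 + tv(x - y) ** 2   # ||x+y||^2 + ||x-y||^2 = 16 + 4 = 20
rhs = 2 * tv(x) ** 2 + 2 * tv(y) ** 2   # 2*||x||^2 + 2*||y||^2 = 8 + 8 = 16

print(lhs, rhs)   # 20.0 16.0 -> parallelogram law fails
```

Since the parallelogram law characterizes norms that come from inner products (Jordan–von Neumann), this single counterexample shows the total-variation norm is not induced by any inner product.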


$\mathbb M$ would be the space of signed measures on $X$ (with respect to a fixed $\sigma$-algebra). This is a Banach space under the total-variation norm, but not a Hilbert space, so it carries no natural inner product.
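To connect this to the sup formula in the question: on a finite set the total-variation norm is exactly $\sup_{|f| \le 1} \int f \, d\mu$, attained at $f = \operatorname{sign}(\mu)$. A small sketch with a made-up signed measure:

```python
import itertools
import numpy as np

# A signed measure mu on a 3-point set with mu(X) = 0,
# i.e. a difference of probability measures.
mu = np.array([0.5, -0.3, -0.2])

tv = np.abs(mu).sum()   # total-variation norm = 1.0

# The sup over |f| <= 1 is attained at an extreme point f in {-1, +1}^3
# (the objective is linear in f), so brute force over those:
best = max(np.dot(f, mu) for f in itertools.product([-1, 1], repeat=3))

print(tv, best)   # both equal 1.0
```

So the norm the question proposes (with $|f| \le 1$) is precisely the total-variation norm of the answer, and the parallelogram-law failure above applies to it directly.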