Double expected value


Let $m$ be a probability measure on $\mathbb{R}^n$, so that $m(\mathbb{R}^n) = 1$.

Consider two measurable functions $f: \mathbb{R}^n \times \mathbb{R}^n \rightarrow \mathbb{R}$, and $g : \mathbb{R}^n \rightarrow \mathbb{R}$. Assume that $f$ and $g$ are uniformly bounded by an integrable function.

I am wondering about the following inequality. $$ \mathbb{E}_w[\mathbb{E}_v[ f(v,w) - g(w) ]] \ \leq \ \mathbb{E}_w[ f(w,w) - g(w) ]$$

Comments. The expected value $\mathbb{E}$ is defined as follows. $\mathbb{E}_w[g(w)]:= \int_{\mathbb{R}^n} g(w) m(dw)$. Therefore $\displaystyle \mathbb{E}[\mathbb{E}[ f(v,w) - g(w) ]] = \int_{\mathbb{R}^n} \left( \int_{\mathbb{R}^n} \left( f(v,w) - g(w) \right) m(dv) \right) m(dw). $
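To make the iterated integral concrete, here is a small Monte Carlo sketch with illustrative choices of my own (not from the question): $n = 1$, $m$ the standard normal law, and bounded $f(v,w) = \sin v \sin w$, $g(w) = \sin^2 w$. Sampling $v$ and $w$ independently from $m$ realizes the inner and outer integrals, while the right-hand side evaluates $f$ on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices, not from the question: n = 1, m = N(0, 1),
# f(v, w) = sin(v) sin(w) and g(w) = sin(w)^2 (both bounded).
def f(v, w):
    return np.sin(v) * np.sin(w)

def g(w):
    return np.sin(w) ** 2

N = 200_000
v = rng.standard_normal(N)  # i.i.d. samples from m
w = rng.standard_normal(N)  # independent i.i.d. samples from m

# E_w[E_v[f(v, w) - g(w)]]: independent v and w realize the iterated
# integral over m(dv) m(dw).
lhs = np.mean(f(v, w) - g(w))

# E_w[f(w, w) - g(w)]: the same sample plays both roles in f.
rhs = np.mean(f(w, w) - g(w))

print(lhs, rhs)
```

For this particular $f$ the left side does come out below the right side, but of course one example proves nothing about the general claim.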

On BEST ANSWER

Although you can easily derive the answer from Tim's excellent hint, let me provide some intuition for why the inequality doesn't have to be true. Namely, let us imagine that the idea of such an inequality came to our mind for the first time, and let's examine this idea, so that next time it will be easier for you to come up with a counterexample yourself.

  1. First of all, note that for the LHS $$ \iint(f(v,w) - g(v))m(\mathrm dw)m(\mathrm dv) = \iint f(v,w)m(\mathrm dw)m(\mathrm dv) - \int g(v)m(\mathrm dv) $$ and for the RHS $$ \int (f(w,w) - g(w))m(\mathrm dw) = \int f(w,w)m(\mathrm dw) - \int g(w)m(\mathrm dw) $$ and so LHS$-$RHS does not depend on $g$ at all. Thus, your inequality is equivalent to $$ \iint f(v,w)m(\mathrm dw)m(\mathrm dv) - \int f(w,w)m(\mathrm dw)\leq0 \tag{1}. $$

  2. Note that if the claim held for every bounded $f$, then $(1)$ would hold for both $f$ and $-f$. Applying $(1)$ to $-f$ gives the reverse inequality, so $(1)$ would in fact have to be an equality for every bounded $f$. And the latter fact clearly does not have to be true: see below.

  3. Even regardless of the trick with the inequality turning into an equality, a critical glance at $(1)$ tells us that it does not have to be true. The point is that you can always decompose $f$ into a diagonal part and an off-diagonal part: let $$ \Delta = \{(x,x):x\in \Bbb R^n\}\subset \Bbb R^n\times \Bbb R^n $$ be the diagonal; then $f = 1_\Delta f_1 + 1_{\Delta^c}f_2$, and the decomposition is clearly unique: $f_1$ gives the values of $f$ on the diagonal, and $f_2$ the values everywhere else. As a result, $(1)$ turns into $$ \iint (1_\Delta f_1(w,v)+1_{\Delta^c}f_2(w,v))\,m(\mathrm dw)\,m(\mathrm dv)\leq \int f_1(w,w)\,m(\mathrm dw), $$ and if you choose $m$ with, say, a continuous density, then $m\otimes m(\Delta) = 0$, so $$ \iint f_2(w,v)\,m(\mathrm dw)\,m(\mathrm dv)\leq \int f_1(w,w)\,m(\mathrm dw). $$ Now, since $f_1$ and $f_2$ can be chosen completely freely, nothing forces the latter inequality to hold, and you can easily come up with a counterexample: e.g. let $f_1$ be negative everywhere and $f_2$ positive everywhere.

     This method of decomposing $f$ into two parts also works for many modifications of your inequality. For instance, if you asked about $$ \Bbb E_w[\Bbb E_v[(f(v,w) - g(w))^2]]\leq \Bbb E_w[(f(w,w) - g(w))^2], $$ then points $1.$ and $2.$ are no longer applicable. However, if you require the latter inequality to hold for all bounded measurable functions, it has to hold at least in the case $g = 0$, and then you can apply the decomposition again.
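The counterexample at the end of point 3 can be checked numerically. In this sketch I instantiate it concretely (my own choices: $n = 1$, $m$ the standard normal law, $f_1 \equiv -1$, $f_2 \equiv +1$), so the double integral evaluates to $+1$ while the diagonal integral evaluates to $-1$, refuting $(1)$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Concrete instantiation (mine, not from the post): n = 1, m = N(0, 1),
# f_1 = -1 on the diagonal, f_2 = +1 off it.
def f(v, w):
    return np.where(v == w, -1.0, 1.0)

N = 100_000
v = rng.standard_normal(N)  # independent samples from m
w = rng.standard_normal(N)

# Independent continuous samples miss the diagonal almost surely,
# so the double integral only sees f_2 = +1.
lhs = np.mean(f(v, w))

# Evaluating on the diagonal only sees f_1 = -1.
rhs = np.mean(f(w, w))

print(lhs, rhs)  # lhs is approximately +1.0, rhs = -1.0, so (1) fails
```

Note that $g$ plays no role here, exactly as point 1 predicts: it cancels from both sides.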