Expectations, Double Integrals and Jensen's Inequality


Consider two random variables distributed $v\sim G(\cdot)$ and $c \sim F(\cdot)$ with pdfs $g(\cdot)$ and $f(\cdot)$. Let $c$ and $v$ have common support $[x,y]$. Let $x<a=E(v)<b<y$, so $[a,b]\subset[x,y]$. Now consider a strictly concave, twice continuously differentiable function $u(\cdot)$ with $u^{\prime}(\cdot)>0$, $u^{\prime\prime}(\cdot)<0$, and $u(0)=0$ (it passes through the origin). Establish sufficient conditions such that $$\int_{a}^{b}u(E(v)-c)f(c)\,dc-\int_{a}^{b}\int_{x}^{y}u(E(v)-v)g(v)f(c)\,dv\,dc\geq0,$$ where $E(v)=\int_{x}^{y}vg(v)\,dv.$

Things I've tried:

  1. $\int_{x}^{y}u(E(v)-v)g(v)\,dv\leq0$ by Jensen's inequality. To see this, let $t=E(v)-v$. Then $E(t)=E_{v}[E(v)-v]=0$, so by concavity $E(u(t))\leq u(E(t))=u(0)=0$, since $u(0)=0$ by assumption.

  2. Clearly, $\int_{a}^{b}u(E(v)-c)f(c)\,dc\leq0$: on the integration range $c\in[a,b]$ we have $c\geq a=E(v)$, so $E(v)-c\leq0$ and hence $u(E(v)-c)\leq u(0)=0$, since $u$ is increasing.

  3. Intuitively, a variant of Jensen's inequality should apply if $c$ and $v$ are i.i.d. Let $c$ and $v$ be i.i.d. with identical supports. Then the integrands are the same, and the expression becomes $\int_{a}^{b}u(E(v)-v)f(c)\,dc-\int_{a}^{b}\int_{x}^{y}u(E(v)-v)g(v)f(c)\,dv\,dc$. However, we can't apply Jensen's inequality directly, since $\int_{a}^{b}u(E(v)-v)f(c)\,dc$ is not of the form $u(E(\cdot))$, even if we "factor out" the outer integrals; only $\int_{x}^{y}u(\cdot)g(v)\,dv$ has the form $E(u(\cdot))$.
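As a numerical sanity check of points 1 and 2 (my own sketch, not part of the question: the choices $u(t)=1-e^{-t}$ and uniform distributions on $[0,1]$ are assumptions that happen to satisfy all the hypotheses):

```python
import math

# Assumed example, not from the post: v, c ~ Uniform[0, 1], so
# g = f = 1 on [0, 1] and E(v) = 0.5; u(t) = 1 - exp(-t) satisfies
# u(0) = 0, u'(t) = exp(-t) > 0, u''(t) = -exp(-t) < 0.

def u(t):
    return 1.0 - math.exp(-t)

def integrate(h, lo, hi, n=100_000):
    """Midpoint-rule quadrature of h on [lo, hi]."""
    dx = (hi - lo) / n
    return sum(h(lo + (i + 0.5) * dx) for i in range(n)) * dx

x, y = 0.0, 1.0
Ev = integrate(lambda v: v, x, y)               # E(v) = 0.5
a, b = Ev, 0.8                                  # x < a = E(v) < b < y

jensen = integrate(lambda v: u(Ev - v), x, y)   # point 1: should be <= 0
point2 = integrate(lambda c: u(Ev - c), a, b)   # point 2: should be <= 0

print(jensen, point2)  # both come out negative for this example
```

Both quantities are indeed negative here, consistent with the Jensen argument in point 1 and the monotonicity argument in point 2.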

At a loss as to what to do here. Any help would be greatly appreciated. Thank you!

There is 1 answer below.


Starting with $E_g[v] := \int_{x}^{y}vg(v)\,dv = a$, write $\Delta$ for the expression in question. Using only the linearity of integration we get $$\Delta = \int_{a}^{b} \left[u(a-c) - \int_{x}^{y}u(a-v)g(v)\,dv\right] f(c) \, dc.$$ Let us define the following operator: $E_f^{[w, z]}[h] := \int_{w}^z h(s)f(s)\,ds$. This inherits many of the properties of the usual expectation operator. In particular, if we write $K = \int_{x}^{y}u(a-v)g(v)\,dv$, we get $$ \Delta = E_f^{[a, b]}[u(a-c) - K] = E_f^{[a, b]}[u(a-c)] - E_f^{[a, b]}[K] = E_f^{[a, b]}[u(a-c)] - [F(b)-F(a)]K. $$
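This decomposition is easy to check numerically (again my own sketch with assumed primitives: $f = g$ the Uniform$[0,1]$ density, so $F(t)=t$, with $u(t)=1-e^{-t}$, $a=0.5$, $b=0.8$):

```python
import math

# Assumed example, not from the answer: f = g = Uniform[0, 1] density,
# so F(t) = t; u(t) = 1 - exp(-t); a = E(v) = 0.5, b = 0.8.

def u(t):
    return 1.0 - math.exp(-t)

def integrate(h, lo, hi, n=20_000):
    """Midpoint-rule quadrature of h on [lo, hi]."""
    dx = (hi - lo) / n
    return sum(h(lo + (i + 0.5) * dx) for i in range(n)) * dx

x, y, a, b = 0.0, 1.0, 0.5, 0.8

K = integrate(lambda v: u(a - v), x, y)       # inner integral K
term = integrate(lambda c: u(a - c), a, b)    # E_f^{[a,b]}[u(a - c)]
delta_split = term - (b - a) * K              # since F(b) - F(a) = b - a

# Delta evaluated directly from its definition as an iterated integral
delta_direct = integrate(lambda c: u(a - c) - K, a, b)

print(delta_split, delta_direct)  # agree up to quadrature round-off
```

The two evaluations of $\Delta$ agree, as linearity requires; incidentally $\Delta$ is negative for this particular choice of primitives, which foreshadows the conclusion below.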

So to have $\Delta \geq 0$ we need $$ \frac{1}{[F(b)-F(a)]}\int_{a}^{b} u(a-c)f(c)\,dc \geq \int_{x}^{y}u(a-v)g(v)\,dv. $$

Since we know from your own work that $E_f^{[a, b]}[u(a-c)] \leq 0$, and since $0 < F(b)-F(a) < 1$, dividing this non-positive quantity by $F(b)-F(a)$ only makes it smaller; hence the condition above implies the weaker (necessary) inequality $$ \int_{a}^{b} u(a-c)f(c)\,dc \geq \int_{x}^{y}u(a-v)g(v)\,dv. $$ If we let $F \equiv G$ and split the right-hand side over $[x,a]$, $[a,b]$ and $[b,y]$, this becomes

$$ \int_{a}^{b} u(a-c)f(c)\,dc \geq \int_{x}^{a}u(a-c)f(c)\,dc + \int_{a}^{b}u(a-c)f(c)\,dc + \int_{b}^{y}u(a-c)f(c)\,dc, $$

i.e., after cancelling the common $\int_{a}^{b}$ term, $$ \int_{x}^{a}u(a-c)f(c)\,dc + \int_{b}^{y}u(a-c)f(c)\,dc \leq 0, $$ which need not hold in general: the first integrand is positive (for $c<a$ we have $u(a-c)>u(0)=0$) while the second is negative, and either term can dominate.
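A concrete instance of the failure (my own assumed example, not from the answer): with $f = g$ the Uniform$[0,1]$ density, $u(t)=1-e^{-t}$, $a=0.5$ and $b=0.8$, the sum of the two tail integrals is positive, so the final condition, and hence $\Delta \geq 0$, fails:

```python
import math

# Assumed example: f = Uniform[0, 1] density, u(t) = 1 - exp(-t),
# a = E(v) = 0.5, b = 0.8, support [x, y] = [0, 1].

def u(t):
    return 1.0 - math.exp(-t)

def integrate(h, lo, hi, n=100_000):
    """Midpoint-rule quadrature of h on [lo, hi]."""
    dx = (hi - lo) / n
    return sum(h(lo + (i + 0.5) * dx) for i in range(n)) * dx

x, y, a, b = 0.0, 1.0, 0.5, 0.8

left_tail = integrate(lambda c: u(a - c), x, a)   # u(a - c) > 0 here
right_tail = integrate(lambda c: u(a - c), b, y)  # u(a - c) < 0 here

print(left_tail + right_tail)  # positive, so the condition fails here
```

Here the positive left tail (roughly $0.107$) outweighs the negative right tail (roughly $-0.099$), so any sufficient condition must restrict the primitives enough to rule out cases like this.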