Prove the monotonicity of the expectation of a binary random variable function


Consider $R$ independent binary random variables $y^1, \ldots, y^R$ over the space $\{-1, +1\}$ such that $\Pr(y^j = 1) = p^j \geq 0.5$ and $\Pr(y^j = -1) = 1 - p^j$, $\forall j = 1,\ldots,R$. Moreover, let $\mathcal{J}$ denote a subset of $\mathcal{Y} = \{1, \ldots, R\}$, i.e., $\mathcal{J} \subseteq \mathcal{Y}$. I would like to prove that:

$f(\mathcal{J} \cup \{j'\}) \geq f(\mathcal{J}) \quad \forall \mathcal{J} \in 2^{\mathcal{Y}}, \forall j' \in \mathcal{Y} \setminus \mathcal{J}$

where the function $f( \cdot )$ is defined as follows:

$f(\mathcal{J}) = E\left(| \sum_{j \in \mathcal{J}} y^j |\right)$

In other words, I want to verify that the aforementioned expectation is monotonically non-decreasing in the set $\mathcal{J}$. Thanks.
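As a sanity check before attempting a proof, the claim is easy to verify exhaustively for small $R$. The snippet below (an illustrative verification script, not a proof) computes $f(\mathcal{J})$ exactly by enumerating all sign patterns and checks monotonicity for every subset and every added index:

```python
import itertools

def f(J, p):
    """Exact E[|sum_{j in J} y^j|] by enumerating all 2^|J| sign patterns."""
    J = list(J)
    total = 0.0
    for signs in itertools.product([1, -1], repeat=len(J)):
        prob = 1.0
        for j, s in zip(J, signs):
            prob *= p[j] if s == 1 else 1 - p[j]
        total += prob * abs(sum(signs))
    return total

# Check f(J ∪ {j'}) >= f(J) for every J ⊆ {0,...,R-1} and every j' ∉ J.
p = [0.5, 0.6, 0.75, 0.9]  # arbitrary probabilities, all >= 0.5
R = len(p)
for r in range(R + 1):
    for J in itertools.combinations(range(R), r):
        for jp in set(range(R)) - set(J):
            assert f(set(J) | {jp}, p) >= f(J, p) - 1e-12
print("monotonicity holds for all subsets")
```

Such a brute-force check of course only covers the tested probabilities, but it rules out a counterexample hiding at small $R$.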

EDIT: I posed a new question about the same proof for the case where $f(\mathcal{J})$ is altered to $f(\mathcal{J}) = E\left(\left| \sum_{j \in \mathcal{J}} \operatorname{logit}(p^j)\, y^j \right|\right)$. The new question can be found here: Where is the fallacy in this coupling argument of two Bernoulli variables?


BEST ANSWER

The result holds. By induction on $R$, it suffices to show that if $X=Y_1+\cdots+Y_n$ for independent $\pm1$ Bernoulli random variables $(Y_k)$ with $P[Y_k=1]=p_k\geqslant\frac12$, and if $Y$ is a $\pm1$ Bernoulli random variable independent of $(Y_k)$ with $P[Y=1]=p\geqslant\frac12$, then $$ E[|X+Y|]\geqslant E[|X|]. $$

Note that $E[|X+Y|]=pE[|X+1|]+(1-p)E[|X-1|]$ and that, for every integer $x$, $$ p|x+1|+(1-p)|x-1|-|x|=\mathbf 1_{x=0}+(2p-1)(\mathbf 1_{x\gt0}-\mathbf 1_{x\lt0}). $$ Since $2p-1\geqslant0$ and $X$ is integer valued, taking expectations shows that $E[|X+Y|]\geqslant E[|X|]$ as soon as $$ P[X\gt0]\geqslant P[X\lt0]. $$ This last inequality is not a priori obvious; fortunately, it follows from a coupling argument.
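The pointwise identity above can be checked mechanically, case by case on the sign of $x$; a quick sketch confirming it for a range of integers $x$ and probabilities $p$:

```python
def indicator(cond):
    """1 if the condition holds, 0 otherwise."""
    return 1 if cond else 0

# Verify p|x+1| + (1-p)|x-1| - |x| = 1_{x=0} + (2p-1)(1_{x>0} - 1_{x<0}).
for p in [0.5, 0.6, 0.8, 1.0]:
    for x in range(-5, 6):
        lhs = p * abs(x + 1) + (1 - p) * abs(x - 1) - abs(x)
        rhs = indicator(x == 0) + (2 * p - 1) * (indicator(x > 0) - indicator(x < 0))
        assert abs(lhs - rhs) < 1e-12
print("identity verified")
```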

To wit, one can assume that, for each $k$, $Y_k=2\mathbf 1_{U_k\leqslant p_k}-1$ where the sequence $(U_k)$ is i.i.d. uniform on the interval $(0,1)$. Let $Y^0_k=2\mathbf 1_{U_k\leqslant\frac12}-1$, then $Y_k^0\leqslant Y_k$ almost surely and the sequence $(Y^0_k)$ is i.i.d. symmetric Bernoulli. In particular, the distribution of $X^0=Y^0_1+\cdots+Y^0_n$ is symmetric hence $P[X^0\gt0]=P[X^0\lt0]$.

Finally, $X^0\leqslant X$ almost surely hence $[X^0\gt0]\subseteq[X\gt0]$ and $[X\lt0]\subseteq[X^0\lt0]$. Thus, $P[X\gt0]\geqslant P[X^0\gt0]=P[X^0\lt0]\geqslant P[X\lt0]$, and we are done.
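The conclusion of the coupling argument, $P[X>0]\geqslant P[X<0]$ whenever every $p_k\geqslant\frac12$, can also be confirmed by exact enumeration rather than sampling; a small sketch (test probabilities chosen arbitrarily):

```python
import itertools

def sign_probs(p):
    """Return (P[X>0], P[X<0]) for X = sum of independent ±1 variables
    with P[Y_k = 1] = p[k], computed by exact enumeration."""
    pos = neg = 0.0
    for signs in itertools.product([1, -1], repeat=len(p)):
        prob = 1.0
        for pk, s in zip(p, signs):
            prob *= pk if s == 1 else 1 - pk
        x = sum(signs)
        if x > 0:
            pos += prob
        elif x < 0:
            neg += prob
    return pos, neg

for p in [[0.5, 0.7], [0.6, 0.6, 0.9], [0.55, 0.8, 0.8, 0.95]]:
    pos, neg = sign_probs(p)
    assert pos >= neg - 1e-12
print("P[X>0] >= P[X<0] in all test cases")
```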

ANSWER

WLOG $\mathcal J = \{1,\ldots,n\}$, $X = \sum_{j=1}^n Y_j$, and $j' = n+1$; this is WLOG because you can always go from a smaller set to a larger one by adding one index at a time, numbering the indices in the order in which you want to add them. First use Jensen's inequality/submartingales to show that adding a symmetric $Y_{n+1}$ increases the expected value of the absolute value. Then use the following argument to show that if $\mathbb P(X > 0) > \mathbb P(X < 0)$, then $\mathbb E(|X + Y_{n+1}|)$ is an increasing function of $p_{n+1}$.

Let $X$ be integer valued and $Y$ be $\pm 1$ with $\mathbb P(Y=1) = p$, independent of $X$. Then $$\mathbb E (|X + Y|) = \mathbb P (X=0) + \sum_{k > 0} \Big[\mathbb P(X = k)\big((k+1) p + (k-1)(1-p)\big) + \mathbb P(X = -k)\big((k+1)(1-p) + (k-1)p\big)\Big].$$
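As a check on this expansion, one can compare it with a direct computation of $\mathbb E(|X+Y|)$; a sketch, using an arbitrarily chosen distribution for $X$ (assumed only for illustration):

```python
# Distribution of an integer-valued X, chosen arbitrarily for illustration.
dist = {-2: 0.1, -1: 0.15, 0: 0.2, 1: 0.25, 2: 0.3}
p = 0.7  # P(Y = 1)

# Direct computation: condition on Y = ±1.
direct = sum(q * (p * abs(x + 1) + (1 - p) * abs(x - 1))
             for x, q in dist.items())

# The expansion: P(X=0) plus the paired terms for k > 0.
expanded = dist.get(0, 0.0) + sum(
    dist.get(k, 0.0) * ((k + 1) * p + (k - 1) * (1 - p))
    + dist.get(-k, 0.0) * ((k + 1) * (1 - p) + (k - 1) * p)
    for k in range(1, 3)
)

assert abs(direct - expanded) < 1e-12
print("expansion matches direct computation")
```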

Differentiating in $p$ gives $\frac d {dp}\,\mathbb E(|X+Y|) = 2 \sum_{k > 0} \big(\mathbb P (X=k) - \mathbb P(X=-k)\big) = 2\big(\mathbb P(X>0) - \mathbb P(X<0)\big)$, which is non-negative in your case. Also, when $p_j = \frac 12$ the expectation increases, since $\mathbb E[|X+Y_j| \mid X] \geq |\mathbb E[X+Y_j \mid X]| = |X|$ by Jensen's inequality, i.e., $|X|, |X+Y_j|$ is a submartingale.
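The derivative claim can be seen concretely: $\mathbb E(|X+Y|)$ is affine in $p$, so its slope can be compared directly with $2\sum_{k>0}(\mathbb P(X=k) - \mathbb P(X=-k))$. A sketch, with an illustrative distribution for $X$ satisfying $\mathbb P(X>0) > \mathbb P(X<0)$:

```python
# Illustrative distribution with P(X>0) > P(X<0).
dist = {-2: 0.1, -1: 0.1, 0: 0.2, 1: 0.3, 2: 0.3}

def e_abs_sum(p):
    """E|X + Y| where Y = +1 w.p. p, -1 w.p. 1-p, independent of X."""
    return sum(q * (p * abs(x + 1) + (1 - p) * abs(x - 1))
               for x, q in dist.items())

# Claimed derivative: 2 * sum_{k>0} (P(X=k) - P(X=-k)), constant in p.
slope = 2 * sum(dist.get(k, 0) - dist.get(-k, 0) for k in [1, 2])

# E|X+Y| is affine in p, so a difference quotient recovers the slope exactly.
numeric = (e_abs_sum(0.9) - e_abs_sum(0.5)) / 0.4
assert abs(slope - numeric) < 1e-9
assert slope > 0
print("E|X+Y| is increasing in p with slope", slope)
```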