Expectation of $b^T \operatorname{sign}(Ab)$.


I'm trying to compute the expectation of:
$$b^T \operatorname{sign}(Ab),$$ where $b$ is an $n\times1$ vector of independent random signs (Rademacher variables):
$$P(b_i = 1) = P(b_i = -1) = 0.5,$$ and $A$ is a fixed $n\times n$ matrix.
I think this might be hard to compute: writing $v = \operatorname{sign}(Ab)$, we would need the probabilities
$$P(v_i > 0).$$

And $v$ and $b$ are correlated...
However, each component involves the sum
$$(Ab)_i = \sum_j A_{ij} b_j,$$ with $v_i = \operatorname{sign}\left((Ab)_i\right)$.

I think there must be some method to find which values of $(Ab)_i$ are interesting (near zero), since we are summing entries of $A$. This way it might be possible to find the distribution of $\operatorname{sign}(Ab)$. However, we will also need to take the correlations into account.

If it's not possible or straightforward to compute exactly, does anyone know how to bound the expectation? Alternatively, I could just sample the $b$'s to compute an empirical expectation, but I would like some guarantee that the empirical expectation is near the true expectation.
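Regarding the empirical route: every realization of $b^T\operatorname{sign}(Ab)$ lies in $[-n,n]$, so Hoeffding's inequality gives $P\left(|\hat\mu_N-\mu|\ge t\right)\le 2\exp\left(-Nt^2/(2n^2)\right)$ for the average $\hat\mu_N$ of $N$ independent samples. A minimal NumPy sketch of the estimator (the function name and example matrix are mine):

```python
import numpy as np

def empirical_expectation(A, num_samples=100_000, seed=0):
    """Monte Carlo estimate of E[b^T sign(A b)], b uniform on {-1,+1}^n."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Draw all sign vectors at once: shape (num_samples, n).
    B = rng.choice([-1.0, 1.0], size=(num_samples, n))
    # Row-wise b^T sign(A b); note (B @ A.T)[s] equals A @ B[s].
    vals = np.einsum('si,si->s', B, np.sign(B @ A.T))
    return vals.mean()

# Example with a diagonally dominant A: every sample equals
# sign(3) + sign(2) = 2, so the estimate is exactly 2.0.
A = np.array([[3.0, 1.0],
              [-1.0, 2.0]])
print(empirical_expectation(A, num_samples=1_000))   # → 2.0
```

By the Hoeffding bound above, $N \ge 2n^2\log(2/\delta)/t^2$ samples suffice for accuracy $t$ with probability $1-\delta$.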

Best answer:

Let us write $u_i^T=(0,\,\dots,\,0,\,1,\,0,\,\dots,\,0)$ for the vector whose $i^{\text{th}}$ component equals $1$ and whose other components equal $0$, so that $u_i^Tu_j=\delta_{ij}$ (Kronecker delta). We will also use the properties $\operatorname{sign}(b_ix)=b_i\operatorname{sign}(x)$ when $b_i=\pm1$ and $\operatorname{sign}(xu_i)=\operatorname{sign}(x)u_i$, and write $b=\sum_ib_iu_i$.

We have (using $Au_j=\sum_kA_{kj}u_k$, then $u_i^Tu_k=\delta_{ik}$, then $b_i=\pm1$ and $b_i^2=1$) $$\begin{split}b^T\operatorname{sign}(Ab)&= \left(\sum_i b_iu_i^T\right)\operatorname{sign}\left(\sum_{jk}b_jA_{kj}u_k\right)\\&=\sum_{ik} b_iu_i^T\operatorname{sign}\left(\sum_jb_jA_{kj}\right)u_k\\&=\sum_ib_i\operatorname{sign}\left(\sum_jA_{ij}b_j\right)\\& =\sum_i\operatorname{sign}\left(\sum_j b_ib_jA_{ij}\right)\\& =\sum_i\operatorname{sign}\left(A_{ii}+\sum_{j\neq i}b_ib_jA_{ij}\right). \end{split}$$
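The final identity above can be sanity-checked numerically; a small sketch (`np.sign(0) = 0`, matching the convention $\operatorname{sign}(0)=0$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))

for _ in range(1000):
    b = rng.choice([-1.0, 1.0], size=n)
    lhs = b @ np.sign(A @ b)           # b^T sign(A b)
    rhs = np.sign(b * (A @ b)).sum()   # sum_i sign(b_i (A b)_i), the collapsed form
    assert lhs == rhs
print("identity verified on 1000 random sign vectors")
```

The check relies only on $b_i\operatorname{sign}(x)=\operatorname{sign}(b_ix)$ for $b_i=\pm1$, so it holds for any real $A$.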

Now let us consider the expectation of $\operatorname{sign}\left(x+\sum_jb_jy_j\right)$ where $x$ and the $y_j$ are positive reals (for general entries this is no restriction, since $b_jA_{ij}$ has the same distribution as $b_j\lvert A_{ij}\rvert$). If $x-\sum_jy_j>0$, the sign is always positive and $\mathbb{E}\left[\operatorname{sign}\left(x+\sum_jb_jy_j\right)\right]=1=\operatorname{sign}(x)$. If $x-\sum_jy_j<0$, the sign of $x+\sum_jb_jy_j$ can be positive, negative, or zero, and its mean depends on the particular values of the $y_j$. If $x<0$, the same rule applies with reversed signs. It follows that if $\lvert A_{ii}\rvert>\sum_{j\neq i}\lvert A_{ij}\rvert$ for all $i$ (strict diagonal dominance), then $$\mathbb{E}\left[b^T\operatorname{sign}(Ab)\right]=\sum_i\operatorname{sign}(A_{ii}).$$ In general, averaging over $b_i=\pm1$ (which is independent of the other $b_j$) shows that the $i^{\text{th}}$ summand has expectation $\tfrac12\,\mathbb{E}\left[\operatorname{sign}(A_{ii}+s_i)+\operatorname{sign}(A_{ii}-s_i)\right]$ with $s_i=\sum_{j\neq i}b_jA_{ij}$; this always has the same sign as $A_{ii}$ (or is zero) and magnitude at most $1$, so one can still state $$\left\lvert\mathbb{E}\left[b^T\operatorname{sign}(Ab)\right]\right\rvert\le\sum_i\left\lvert\operatorname{sign}(A_{ii})\right\rvert,$$ the number of nonzero diagonal entries of $A$.
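For small $n$ the expectation can be computed exactly by enumerating all $2^n$ equally likely sign vectors, which makes the diagonally dominant case easy to verify. A sketch (the matrix is an assumption, built so each $\lvert A_{ii}\rvert$ dominates the off-diagonal absolute row sums):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
n = 4
A = rng.uniform(-1, 1, size=(n, n))   # off-diagonal noise, entries in (-1, 1)
A += np.diag([5.0, -5.0, 5.0, 5.0])   # heavy diagonal: |A_ii| > sum_{j != i} |A_ij|

# Exact expectation: average b^T sign(A b) over all 2^n sign vectors b.
exact = np.mean([np.asarray(s) @ np.sign(A @ np.asarray(s))
                 for s in product([-1.0, 1.0], repeat=n)])
predicted = np.sign(np.diag(A)).sum()  # sum_i sign(A_ii) = 1 - 1 + 1 + 1 = 2

print(exact, predicted)                # → 2.0 2.0
```

Under strict diagonal dominance every individual realization already equals $\sum_i\operatorname{sign}(A_{ii})$, so the two numbers agree exactly, not just in expectation.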