With respect to the scenario introduced in "Prove the monotonicity of the expectation of a binary random variable function", let us now suppose that the function:
$$\begin{align*} f(\mathcal{J}) = E\left(| \sum_{j \in \mathcal{J}} y^j |\right) \end{align*}$$ is altered so that $f(\mathcal{J}) = E\left(| \sum_{j \in \mathcal{J}} \text{logit}(p^j) y^j |\right)$, where $\text{logit}(p^j) = \log(\frac{p^j}{1 - p^j})$. Moreover, let us now alter the definition of $X$ from $X = Y^1 + \ldots + Y^N$ to $X = \text{logit}(p^1)Y^1 + \ldots + \text{logit}(p^N)Y^N$.
Hence, the proof provided in the best answer for the previous question needs to be adapted to the new definitions.
Specifically, the coupling argument introduced in the final step of that proof no longer holds, since $\text{logit}(0.5) = 0$ and $X^0$ reduces to a deterministic value. Thus, a different way of proving that $P(X > 0) > P(X < 0)$ needs to be devised.
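To make the obstruction concrete, here is a minimal numerical check (the helper name `logit` is mine) that $\text{logit}(0.5) = 0$, so any coordinate with $p^j = 0.5$ contributes nothing to $X$, while the weights are nonnegative whenever $p^j \geq 0.5$:

```python
import math

def logit(p):
    """logit(p) = log(p / (1 - p))."""
    return math.log(p / (1 - p))

# logit(0.5) = log(1) = 0: a coordinate with p^j = 0.5 drops out of X.
print(logit(0.5))  # 0.0

# For p^j >= 0.5 the weights logit(p^j) are nonnegative.
print(all(logit(p) >= 0 for p in [0.5, 0.6, 0.75, 0.9]))  # True
```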
I will now introduce a tentative proof. Although the proof seems formally correct, it does not look correct from an intuitive viewpoint.
To this end, let us rewrite $P(X < 0)$ as $P(-X > 0)$. The random variable $-X$ can be expressed as:
$$\begin{align*} -X = \text{logit}(p^1)(-Y^1) + \ldots + \text{logit}(p^N)(-Y^N) \end{align*}$$
Similarly to the proof for the previous question, let us now express $Y^j$ as $Y^j = 2\mathbf{1}_{U^j \leq p^j} - 1$, where $U^j \sim U(0,1)$. Since $Y^j$ is a symmetric Bernoulli random variable taking values in $\{-1,1\}$, we can express $-Y^j$ as $-Y^j = 2\mathbf{1}_{U^j \leq (1-p^j)} - 1$. Since $p^j \geq 0.5$ for all $j = 1, \ldots, N$, we have that $-Y^j \leq Y^j$ almost surely. Consequently $-X \leq X$ almost surely and the claim is verified.
The aforementioned proof does not look correct. It is sufficient to notice that, according to the definitions of $Y^j$ and $-Y^{j}$, it is possible to have $Y^j = 1$ and $-Y^j = 1$ at the same time.
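This observation is easy to check numerically; a minimal sketch with an assumed value of $p^j$ and of $U^j$ (both chosen by me for illustration):

```python
# With p >= 0.5 we have 1 - p <= p, so any draw u <= 1 - p satisfies
# both u <= p and u <= 1 - p: the two indicator formulas both give +1.
p = 0.8
u = 0.1  # u <= 1 - p = 0.2 <= p = 0.8

y = 2 * (u <= p) - 1          # Y^j as defined
neg_y = 2 * (u <= 1 - p) - 1  # the proposed expression for -Y^j

print(y, neg_y)  # 1 1  -> neg_y is not -y pointwise
```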
What am I doing wrong? Moreover, how can the claim be correctly verified? Thanks.
A mistake is that $-Y^j$ is not what you write but rather $-Y^j = 2\mathbf{1}_{U^j \gt p^j} - 1$. Of course, $-Y^j$ has the same distribution as $2\mathbf{1}_{U^j \leqslant 1-p^j} - 1$, since $U^j$ is uniform on $[0,1]$ hence the distributions of $1-U^j$ and $U^j$ coincide, but the pointwise identity in your post fails.
More generally, note that $-X\leqslant X$ almost surely is logically equivalent to $X\geqslant0$ almost surely. And in your setting, $[X\lt0]$ has positive probability...
What happens here is that, simply because $P(Y^j=+1)\geqslant\frac12$ for every $j$, each random variable $-Y^j$ is stochastically dominated by $Y^j$, hence (possibly enlarging the sample space) there exist some independent random variables $(Y'_j)$ such that each $Y'_j$ is distributed like $-Y^j$ and $Y'_j\leqslant Y^j$ almost surely. Then, every $\text{logit}(p^j)$ being nonnegative, $X'=\sum\limits_j\text{logit}(p^j)Y'_j$ is such that $X'$ is distributed like $-X$ and $X'\leqslant X$ almost surely. Thus, $[X\lt0]$ has the same probability as $[X'\gt0]$ and $[X'\gt 0]\subseteq[X\gt0]$. Together, these two assertions imply that $P(X\lt0)\leqslant P(X\gt0)$. The inequality is strict as soon as some $p^j\gt\frac12$: taking the explicit coupling $Y'_j=2\mathbf{1}_{U^j\leqslant1-p^j}-1$ (which is distributed like $-Y^j$ and satisfies $Y'_j\leqslant Y^j$), the event $[1-p^j\lt U^j\leqslant p^j$ for every $j$ with $p^j\gt\frac12]$ has positive probability, and on it $X\gt0$ while $X'=-X\lt0$, hence $P(X\lt0)=P(X'\gt0)\lt P(X\gt0)$.
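The coupling can also be checked by simulation. The sketch below (assumed parameter values, chosen by me; only the standard library is used) realizes $Y'_j = 2\mathbf{1}_{U^j \leqslant 1-p^j} - 1$ on the same sample space as $Y^j$, verifies the pointwise domination $X' \leqslant X$ on every draw, and estimates $P(X>0)$ versus $P(X<0)$:

```python
import math
import random

def logit(p):
    return math.log(p / (1 - p))

random.seed(0)
ps = [0.6, 0.7, 0.9]            # assumed values, each p^j >= 0.5
w = [logit(p) for p in ps]      # nonnegative weights logit(p^j)

n_trials = 100_000
count_pos = count_neg = 0
for _ in range(n_trials):
    us = [random.random() for _ in ps]                   # U^j ~ U(0,1)
    y  = [2 * (u <= p) - 1 for u, p in zip(us, ps)]      # Y^j
    yp = [2 * (u <= 1 - p) - 1 for u, p in zip(us, ps)]  # Y'_j, distributed like -Y^j
    assert all(a <= b for a, b in zip(yp, y))            # Y'_j <= Y^j pointwise
    x  = sum(wi * yi for wi, yi in zip(w, y))            # X
    xp = sum(wi * yi for wi, yi in zip(w, yp))           # X', distributed like -X
    assert xp <= x
    count_pos += x > 0
    count_neg += x < 0

print(count_pos / n_trials, count_neg / n_trials)  # estimate of P(X>0) vs P(X<0)
```

With these particular weights, $\text{logit}(0.9) > \text{logit}(0.6) + \text{logit}(0.7)$, so the sign of $X$ is that of $Y^3$ and the two estimated probabilities should come out near $0.9$ and $0.1$.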