Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, and let $\mathbb{E}$ denote expectation with respect to $\mathbb{P}$.
Consider random variables $f: \Omega \rightarrow \{0,1,2\}$ and $g: \Omega \rightarrow [0,1]$, where $\Omega \subseteq \mathbb{R}^n$, and consider the expected value of their product, $$ \mathbb{E}\left[ f(\cdot) g(\cdot) \right] := \int_{\Omega} f(\omega) g(\omega)\, \mathbb{P}(d\omega). $$
If $f$ and $g$ are independent, then $\mathbb{E}[f(\cdot) g(\cdot)] = \mathbb{E}[f(\cdot)]\, \mathbb{E}[g(\cdot)]$.
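As a quick numerical sanity check of the factorization under independence (a sketch only; the uniform choice over $\{0,1,2\}$ for $f$ and $\mathrm{Uniform}[0,1]$ for $g$ are illustrative assumptions, not part of the setup above):

```python
import random

random.seed(0)
N = 200_000

# Illustrative independent random variables matching the stated ranges:
# f takes values in {0, 1, 2}, g takes values in [0, 1].
f = [random.choice([0, 1, 2]) for _ in range(N)]
g = [random.random() for _ in range(N)]

E_fg = sum(x * y for x, y in zip(f, g)) / N
E_f = sum(f) / N
E_g = sum(g) / N

# Under independence, E[f g] = E[f] E[g], so this difference
# should vanish up to Monte Carlo noise of order 1/sqrt(N).
print(abs(E_fg - E_f * E_g))
```

Here the theoretical values are $\mathbb{E}[f] = 1$ and $\mathbb{E}[g] = 1/2$, so both sides of the identity are close to $1/2$.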
I am wondering about conditions under which $\mathbb{E}[f(\cdot) g(\cdot)] \geq \mathbb{E}[f(\cdot)]\, \mathbb{E}[g(\cdot)]$, i.e., under which $f$ and $g$ are nonnegatively correlated ($\operatorname{Cov}(f, g) \geq 0$).
Additional Assumption: $\mathbb{E}[g(\cdot)] = \frac{1}{2}\mathbb{E}[f(\cdot)]$.
A sufficient condition is that $f$ and $g$ are comonotone, i.e., for every $\omega$ and $\omega'$ in $\Omega$, $$ (f(\omega)-f(\omega'))\cdot(g(\omega)-g(\omega'))\geq 0. $$ Indeed, drawing $\omega$ and $\omega'$ independently from $\mathbb{P}$ and taking expectations of the left-hand side yields $2\left(\mathbb{E}[f(\cdot) g(\cdot)] - \mathbb{E}[f(\cdot)]\,\mathbb{E}[g(\cdot)]\right) \geq 0$.
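A numerical sketch of this sufficient condition (the particular step function and the $\mathrm{Uniform}[0,1]$ source variable are illustrative assumptions): taking $f$ and $g$ to be nondecreasing functions of a common uniform variable $u$ makes the displayed product nonnegative for every pair $\omega, \omega'$, and this example also happens to satisfy the additional assumption $\mathbb{E}[g] = \tfrac{1}{2}\mathbb{E}[f]$.

```python
import random

random.seed(1)
N = 200_000

def f_of(u):
    # Nondecreasing step function into {0, 1, 2} (illustrative choice
    # with E[f] = 1 when u is Uniform[0, 1]).
    return 0 if u < 1/3 else (1 if u < 2/3 else 2)

u = [random.random() for _ in range(N)]
f = [f_of(x) for x in u]
g = u  # identity map: nondecreasing in u, values in [0, 1], E[g] = 1/2

E_fg = sum(x * y for x, y in zip(f, g)) / N
E_f = sum(f) / N
E_g = sum(g) / N

# Both f and g are nondecreasing in the same u, so
# (f(w) - f(w')) * (g(w) - g(w')) >= 0 for all w, w'.
# Theoretical values: E[f g] = 13/18 ~ 0.722, E[f] E[g] = 1/2.
print(E_fg, E_f * E_g)
```

Exact computation for this example: $\mathbb{E}[fg] = \int_{1/3}^{2/3} u\, du + 2\int_{2/3}^{1} u\, du = \tfrac{1}{6} + \tfrac{5}{9} = \tfrac{13}{18} > \tfrac{1}{2} = \mathbb{E}[f]\,\mathbb{E}[g]$, confirming the inequality is strict here.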