Inequality of conditioned expectation including geometric distributions


Let $Y$ be any $\mathbb{N}_0$-valued random variable. Further, let $0 \leq s \leq t < 1$, and let $X^{(s)}$ have geometric law $\text{Geo}(1-s)$ and $X^{(t)}$ geometric law $\text{Geo}(1-t)$, i.e. $$ P(X^{(s)} = k) = (1-s)s^{k-1} $$ for $k \geq 1$, and analogously for $X^{(t)}$. Assume the random variables $Y, X^{(s)}, X^{(t)}$ are independent.

I am thinking about whether the following holds true: $$ E[ Y \mid Y \geq X^{(s)} ] \leq E [ Y \mid Y \geq X^{(t)}]$$ for all $0 \leq s \leq t < 1$.

I suspect that this holds independently of the law of $Y$, or perhaps only if $Y$ has infinite support. My intuition is that for large $k$, $E[Y \mid Y \geq k]$ is close to $k$, and $E X^{(s)} \leq E X^{(t)}$. But it might indeed be false; in that case I am interested in a counterexample, or in any ideas about what restrictions have to be made.
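Before attempting a proof, the conjecture can be probed numerically. The following Python sketch evaluates $E[Y \mid Y \geq X^{(s)}]$ by enumerating the joint pmf directly from the definitions; the law `pY` of $Y$ is a hypothetical finite-support stand-in chosen only for illustration:

```python
def cond_exp(pY, s):
    """E[Y | Y >= X] for X ~ Geo(1 - s), i.e. P(X = k) = (1 - s) * s**(k - 1)
    for k >= 1, with Y independent of X and pY the pmf of Y on
    {1, ..., len(pY)} (a finite-support stand-in for a general law of Y)."""
    num = den = 0.0
    for n, p in enumerate(pY, start=1):
        for k in range(1, n + 1):  # on {Y = n}, the event {Y >= X} means X <= n
            q = p * (1 - s) * s ** (k - 1)  # joint pmf P(Y = n, X = k)
            den += q       # accumulates P(Y >= X)
            num += n * q   # accumulates E[Y * 1_{Y >= X}]
    return num / den

# probe monotonicity in s for one hypothetical law of Y on {1, 2, 3}
pY = [0.5, 0.3, 0.2]
vals = [cond_exp(pY, s) for s in (0.0, 0.3, 0.6, 0.9)]
assert all(a <= b for a, b in zip(vals, vals[1:]))
```

For this example the conditional expectation is indeed nondecreasing in $s$; at $s = 0$ the conditioning event $\{Y \geq X^{(0)}\} = \{Y \geq 1\}$ is (almost) trivial, so the value starts at $E[Y]$.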


First, $E(Y \mid Y \geqslant X^{(s)}) = \dfrac{E(Y I_{\{Y \geqslant X^{(s)}\}})}{P(Y \geqslant X^{(s)})}$. Since $\sum\limits_{k = 1}^n P(X^{(s)} = k) = \sum\limits_{k = 1}^n (1 - s) s^{k - 1} = 1 - s^n$ by the geometric sum formula,
\begin{gather*}
P(Y \geqslant X^{(s)}) = \sum_{n = 1}^\infty \sum_{k = 1}^n P(Y = n,\ X^{(s)} = k) = \sum_{n = 1}^\infty \sum_{k = 1}^n P(Y = n) P(X^{(s)} = k)\\
= \sum_{n = 1}^\infty P(Y = n) \sum_{k = 1}^n P(X^{(s)} = k) = \sum_{n = 1}^\infty (1 - s^n) P(Y = n),
\end{gather*}
and
\begin{gather*}
E(Y I_{\{Y \geqslant X^{(s)}\}}) = \sum_{n = 1}^\infty \sum_{k = 1}^n n P(Y = n,\ X^{(s)} = k) = \sum_{n = 1}^\infty \sum_{k = 1}^n n P(Y = n) P(X^{(s)} = k)\\
= \sum_{n = 1}^\infty n P(Y = n) \sum_{k = 1}^n P(X^{(s)} = k) = \sum_{n = 1}^\infty n (1 - s^n) P(Y = n),
\end{gather*}
so$$ E(Y \mid Y \geqslant X^{(s)}) = \frac{\sum\limits_{n = 1}^\infty n (1 - s^n) P(Y = n)}{\sum\limits_{n = 1}^\infty (1 - s^n) P(Y = n)}, $$ and analogously,$$ E(Y \mid Y \geqslant X^{(t)}) = \frac{\sum\limits_{n = 1}^\infty n (1 - t^n) P(Y = n)}{\sum\limits_{n = 1}^\infty (1 - t^n) P(Y = n)}. $$ Now the inequality is proved in this answer to Monotonicity of a fraction combined with series (related to probability distributions).
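As a sanity check (not part of the proof), the derived closed form can be compared numerically against direct enumeration of the joint pmf; the law `pY` below is a hypothetical finite-support example:

```python
def closed_form(pY, s):
    """E[Y | Y >= X^(s)] via the ratio of sums derived above;
    pY is the pmf of Y on {1, ..., len(pY)} (finite support for illustration)."""
    num = sum(n * (1 - s**n) * p for n, p in enumerate(pY, start=1))
    den = sum((1 - s**n) * p for n, p in enumerate(pY, start=1))
    return num / den

def direct(pY, s):
    """Same quantity from the joint pmf P(Y = n, X^(s) = k) = P(Y = n)(1-s)s^(k-1)."""
    num = den = 0.0
    for n, p in enumerate(pY, start=1):
        for k in range(1, n + 1):
            q = p * (1 - s) * s ** (k - 1)
            den += q
            num += n * q
    return num / den

pY = [0.5, 0.3, 0.2]  # hypothetical law of Y on {1, 2, 3}
for s in (0.1, 0.5, 0.9):
    assert abs(closed_form(pY, s) - direct(pY, s)) < 1e-12
```

The two computations agree because the inner geometric sum collapses to $1 - s^n$, which is exactly the simplification used in the derivation.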