Is $\mathbb{E}[X|X<Y]$ finite if $\mathbb{E}[Y]<\infty$?


I'm afraid I may be overlooking an obvious answer to this question, but perhaps someone can provide some assistance, as probability is not my area of expertise.

Suppose we have two independent random variables, $X$ and $Y$, with $X$ finite almost surely, but $\mathbb{E}[X]=\infty$ and $\mathbb{E}[Y]<\infty$. I'm trying to understand the quantity $\mathbb{E}[X\mid X<Y]$. In particular, I would like to know if this conditional expectation is finite.

I feel like I should be able to say $\mathbb{E}[X\mid X<Y] < \mathbb{E}[Y]$, which gives the result, but then I got a bit caught up in the details.

Any help, even just a nudge in the right direction, would be greatly appreciated.

Edit: sorry to have left this out, but $X$ and $Y$ are non-negative random variables.

4 Answers

Accepted answer

Since we're not supposed to answer in comments, I'll write this out. Note that this is an example where the measure-theoretic definition of conditional probability is too restrictive and we need to work with the classical definition. Since there seems to be a bit of confusion here, I'll write out more details than I normally would.

On our original probability space $(\Omega,\mathcal{F},\mathbb{P})$, define the indicator function of a measurable set $B \in \mathcal{F}$, for $\omega \in \Omega$, to be

$$ 1_B(\omega) = \begin{cases} 1 & \omega \in B \\ 0 & \omega \notin B \end{cases}. $$

We have $\mathbb{P}(X\geq0) = 1,$ $\mathbb{P}(Y\geq0) = 1$, $\mathbb{P}(Y>X)>0$, and $\mathbb{E}[Y]<\infty$. We want to show that under $\tilde{\mathbb{P}}(\cdot) = \mathbb{P}(\cdot\mid X<Y)$ (i.e. on the new space $(\Omega,\mathcal{F},\tilde{\mathbb{P}})$), $X$ has a finite mean. I wrote it this way because it is helpful to think of a classical conditional probability as a completely different probability measure on the original space. It shouldn't be surprising that changing the reference measure can change integrability properties.

We have

$$ \tilde{\mathbb{P}}(A) = \mathbb{P}(A\mid X<Y) = \frac{\mathbb{P}(A \cap \{X<Y\})}{\mathbb{P}(X<Y)} = \frac{\mathbb{E}[1_A 1_{\{X<Y\}}]}{\mathbb{P}(X<Y)}. $$

The conditional expectation is similarly given by

$$ \tilde{\mathbb{E}}[X] = \mathbb{E}[X\mid X<Y] = \frac{\mathbb{E}[X 1_{\{X<Y\}}]}{\mathbb{P}(X<Y)}. $$

Since $X \geq 0$ and $Y \geq 0$, we have almost surely

$$ 0 \leq X 1_{\{X<Y\}} \leq Y 1_{\{X<Y\}} \leq Y. $$

Taking expectations, we see that

$$ 0 \leq \mathbb{E}[X 1_{\{X<Y\}}] \leq \mathbb{E}[Y 1_{\{X<Y\}}] \leq \mathbb{E}[Y]. $$

Dividing by $\mathbb{P}(X<Y)$, we conclude

$$ 0 \leq \mathbb{E}[X\mid X<Y] \leq \mathbb{E}[Y\mid X<Y] \leq \frac{\mathbb{E}[Y]}{\mathbb{P}(X<Y)}. $$

Now, you might be wondering why we picked up that extra factor of $\mathbb{P}(X<Y)^{-1}$ at the end. Intuitively, we are working on the event where $Y$ is larger than something we know can be quite large (since $\mathbb{E}[X] = \infty$ in your original statement). We should not be surprised that $Y$ is typically quite large relative to its usual size if it is larger than something which has an unconditionally infinite mean.
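As a quick sanity check of the bound $\mathbb{E}[X\mid X<Y] \leq \mathbb{E}[Y]/\mathbb{P}(X<Y)$, here is a Monte Carlo sketch of my own (not part of the argument above); the Pareto/exponential choices are assumptions made purely for illustration:

```python
import numpy as np

# X ~ Pareto with tail index 1, so P(X > t) = 1/t on [1, inf) and E[X] = inf.
# Y ~ Exponential(1), independent of X, so E[Y] = 1.
rng = np.random.default_rng(0)
n = 2_000_000

x = rng.pareto(1.0, n) + 1.0   # heavy-tailed: infinite mean
y = rng.exponential(1.0, n)

mask = x < y
p_hat = mask.mean()            # estimate of P(X < Y)
cond_mean = x[mask].mean()     # estimate of E[X | X < Y]
bound = y.mean() / p_hat       # empirical E[Y] / P(X < Y)

print(f"P(X < Y) ~ {p_hat:.4f}")
print(f"E[X | X < Y] ~ {cond_mean:.4f}, bound ~ {bound:.4f}")
```

The conditional mean stays finite (and below the bound) even though the unconditional mean of $X$ is infinite, since on $\{X<Y\}$ the samples of $X$ are dominated pathwise by the samples of $Y$.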

Answer

Not necessarily. Let $X$ be a random variable, finite almost surely, satisfying $\mathbb{E}[X^+] = \infty$ and $\mathbb{E}[X\mid X<0] = -\infty$. Let $Y$ be the random variable identically equal to $0$. Then $\mathbb{E}[X\mid X<Y] = -\infty$, which is not finite. (Note that such an $X$ necessarily has $\mathbb{E}[X^-] = \infty$ as well, so $\mathbb{E}[X]$ is of the form $\infty - \infty$ rather than $+\infty$; this example also predates the edit restricting $X$ and $Y$ to be non-negative.)

An example of such an $X$ is a random variable satisfying $X=n$ with probability $\frac{c}{n^2}$ and $X=-n$ with probability $\frac{c}{2n^2}$ for each $n \in \mathbb{N}$, where $c$ is chosen so that the probabilities $\sum_{n \in \mathbb{N}}\left[\frac{c}{n^2} + \frac{c}{2n^2}\right]$ add to $1$ (i.e. $c = 4/\pi^2$).
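A quick numeric check of this construction (my own addition): the normalizing constant follows from $\sum_n \frac{3c}{2n^2} = \frac{c\pi^2}{4} = 1$, and the harmonic partial sums of $n \cdot \mathbb{P}(X=-n)$ grow without bound, which is what makes $\mathbb{E}[X\mid X<0] = -\infty$.

```python
import math

# With P(X = n) = c/n^2 and P(X = -n) = c/(2 n^2), the total mass is
# c * (3/2) * sum(1/n^2) = c * pi^2 / 4, so c = 4 / pi^2.
c = 4 / math.pi**2
total = sum(c / n**2 + c / (2 * n**2) for n in range(1, 2_000_000))
print(f"total probability ~ {total:.6f}")  # approaches 1 as the cutoff grows

# The negative-part partial sums (c/2) * H_N diverge like log(N).
partials = []
for cutoff in (10**2, 10**4, 10**6):
    neg = sum(n * c / (2 * n**2) for n in range(1, cutoff + 1))
    partials.append(neg)
    print(f"sum of n * P(X = -n) up to n = {cutoff}: {neg:.3f}")
```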

Answer

I think the hypotheses are not enough to guarantee the claim. If $Y$ is a non-negative random variable but $X$ takes values over all the reals, we could construct $X$ such that

$$\int_{\{X\le0\}}X\,d\mathbb{P}=-\infty,$$

as with a Cauchy distribution. This could be trouble.
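To make the Cauchy remark concrete (my own illustration, not part of the answer): for the standard Cauchy density $f(x) = \frac{1}{\pi(1+x^2)}$, the truncated negative-tail integral has the closed form $\int_{-M}^{0} x f(x)\,dx = -\frac{\ln(1+M^2)}{2\pi}$, which diverges as $M \to \infty$:

```python
import math

# Truncated negative-tail integral of the standard Cauchy density
# f(x) = 1 / (pi * (1 + x^2)):
#     int_{-M}^{0} x f(x) dx = -ln(1 + M^2) / (2 * pi),
# which tends to -infinity (logarithmically slowly) as M grows.
def truncated_neg_mean(M: float) -> float:
    return -math.log(1 + M**2) / (2 * math.pi)

for M in (10.0, 1e3, 1e6):
    print(f"integral over [-{M:g}, 0]: {truncated_neg_mean(M):.4f}")
```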

But note that every treatment I have seen requires $X$ to be integrable (or at least quasi-integrable) in order to define the conditional expectation. If you have doubts, you could read Chapter 2 of "Basic Stochastic Processes" by Zdzisław Brzeźniak.

Answer

If $\operatorname{E}[X]=\infty$ then $\operatorname{E}[X^-]<\infty$, because

$$ \operatorname{E}[X]:=\int_{\Omega }X^+\mathop{}\!d P-\int_{\Omega }X^-\mathop{}\!d P, $$

where $X^-:=\max\{-X,0\}\geqslant 0$ and $X^+:=\max\{X,0\}\geqslant 0$. Also observe that $X^-\mathbf{1}_{A}\leqslant X^-$ for any measurable set $A$, and that

$$ \{\omega\in \Omega : X(\omega )<Y(\omega )\}=\{\omega\in \Omega : X^+(\omega )<Y(\omega )+X^-(\omega )\}\subset \{\omega\in \Omega : X^+(\omega )<Y^+(\omega )+X^-(\omega )\}, $$

so

$$ X^+\mathbf{1}_{X<Y}=X^+\mathbf{1}_{X^+<Y+X^-}\leqslant X^+\mathbf{1}_{X^+<Y^++X^-}\leqslant Y^+ + X^-. $$

Now, given an event $A$ of positive measure, the expectation conditioned on this event is defined as $\operatorname{E}[X|A]:=\frac{\operatorname{E}[X\mathbf{1}_{A}]}{\Pr [A]}$. Putting it all together, when $\Pr [X<Y]>0$ we have

$$ \begin{align*} |\operatorname{E}[X|X<Y]|&=\left|\frac{\operatorname{E}[X\mathbf{1}_{X<Y}]}{\Pr [X<Y]}\right|\leqslant \frac{\operatorname{E}[|X|\mathbf{1}_{X<Y}]}{\Pr [X<Y]}\\ &= \frac{\operatorname{E}[X^-\mathbf{1}_{X<Y}+X^+\mathbf{1}_{X<Y}]}{\Pr [X<Y]}\leqslant \frac{2\operatorname{E}[X^-]+\operatorname{E}[Y ^+]}{\Pr [X<Y]}<\infty. \end{align*} $$

∎
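The bound $|\operatorname{E}[X\mid X<Y]| \leqslant \frac{2\operatorname{E}[X^-]+\operatorname{E}[Y^+]}{\Pr[X<Y]}$ can be checked numerically; the following is my own sketch, with an $X$ I constructed (heavy positive tail, bounded negative part) so that $\operatorname{E}[X^+]=\infty$ but $\operatorname{E}[X^-]<\infty$, and $Y$ exponential:

```python
import numpy as np

# X: with prob 1/2 a Pareto(1) draw on [1, inf) (so E[X^+] = inf),
#    with prob 1/2 minus a Uniform(0, 1) draw (so E[X^-] <= 1/2).
# Y ~ Exponential(1), independent, so E[Y^+] = 1.
rng = np.random.default_rng(1)
n = 2_000_000

sign = rng.random(n) < 0.5
x = np.where(sign, rng.pareto(1.0, n) + 1.0, -rng.random(n))
y = rng.exponential(1.0, n)

mask = x < y
p_hat = mask.mean()
cond_mean = x[mask].mean()
# Empirical version of (2 E[X^-] + E[Y^+]) / P(X < Y):
bound = (2 * np.maximum(-x, 0).mean() + np.maximum(y, 0).mean()) / p_hat

print(f"P(X < Y) ~ {p_hat:.4f}")
print(f"E[X | X < Y] ~ {cond_mean:.4f}, bound ~ {bound:.4f}")
```

Because $X^+\mathbf{1}_{X<Y} \leqslant Y^+ + X^-$ holds pathwise, the empirical conditional mean is guaranteed to respect the empirical bound sample by sample, not just in the limit.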