Conditional distribution function of one random variable given the sum of two


I am trying to solve the following exercise in *Probability Theory* by A. Klenke (3rd edition).

Let $X$ and $Y$ be independent exponential random variables with parameter $\theta>0$. Compute $P[X \leq x \mid X+Y]$ for $x\geq0$.

My solution is based only on the definition of conditional expectation, in particular on this property: if $\mathbb{E}[\mathbb{1}_A X]=\mathbb{E}[\mathbb{1}_A\mathbb{E}[X|\mathcal{F}]]$ for every $A \in \mathcal{F}$, then $\mathbb{E}[X|\mathcal{F}]$ is a conditional expectation, where $X\in\mathcal{L}^1(\Omega, \mathcal{A},\mathbb{P})$ and $\mathcal{F}\subset \mathcal{A}$ is a sub-$\sigma$-algebra of $\mathcal{A}$.

Thus, for every $A\in \sigma(X+Y)$:

$\int_A \mathbb{1}_{X(\omega)\in[0,x]} d\mathbb{P}=\int_A\mathbb{1}_{X(\omega)\in[0,x]} d(\mathbb{P}\circ(X \times (X+Y))^{-1})=\int_A\int_0^t\mathbb{1}_{t-y\in[0,x]} \theta e^{-\theta(t-y)}\theta e^{-\theta y}dydt=\int_A \int_{t-x}^{t} \theta^2e^{-\theta t}dydt=\int_A \frac{x}{t} t\theta^2e^{-\theta t}dt = \int_A \frac{x}{T} d\mathbb{P}$.

So I conclude: $P[X \leq x | X+Y] = \frac{x}{X+Y}$.

In the second equality I obtained the density of $(X,T)$, where $T=X+Y$, in this way: $f_{X,T}(x,t)=f_{X,Y}(t-y,y)=f_X(t-y)f_Y(y)$ by the independence property.

Is this correct?

Edit

Taking into account the comments made by @D Ford: if I define $T=X+Y$, then this is the corrected chain of equalities:

$\int_A \mathbb{1}_{X\in[0,x]}(\omega) d\mathbb{P}= \\ \int_{T(A)}\mathbb{1}_{[0,x]}(X) d(\mathbb{P}\circ(X \times (X+Y))^{-1})=\\ \int_{T(A)}\int_0^t\mathbb{1}_{t-y\in[0,x]} \theta e^{-\theta(t-y)}\theta e^{-\theta y}dydt=\\ \int_{T(A)} \int_{t-x}^{t} \theta^2e^{-\theta t}dydt=\\ \int_{T(A)} \frac{x}{t} t\theta^2e^{-\theta t}dt = \\ \int_A \frac{x}{T} d\mathbb{P}$.
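As a numerical sanity check (not part of the argument above), the claimed identity $P[X \leq x \mid X+Y = t] = x/t$ can be probed by Monte Carlo: keep only samples where $X+Y$ falls in a narrow band around $t$ and compare the empirical frequency of $\{X \leq x\}$ with $x/t$. This is a sketch; the values of `theta`, `t`, `x`, and the band half-width `eps` are arbitrary choices.

```python
import random

random.seed(0)
theta = 1.0          # rate parameter (arbitrary choice)
t, x = 2.0, 0.5      # condition on X + Y near t, evaluate P[X <= x | X + Y = t]
eps = 0.05           # half-width of the conditioning band around t

n = 1_000_000
hits = total = 0
for _ in range(n):
    X = random.expovariate(theta)
    Y = random.expovariate(theta)
    if abs(X + Y - t) < eps:     # keep samples with X + Y ≈ t
        total += 1
        hits += (X <= x)

estimate = hits / total
print(estimate, x / t)           # both should be close to 0.25
```

The agreement improves as `eps` shrinks and `n` grows, since conditioning on the band approximates conditioning on $\{X+Y=t\}$.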

2 Answers


In the second equality I obtained the density of $(X,T)$, where $T=X+Y$, in this way: $f_{X,T}(x,t)=f_{X,Y}(t-y,y)=f_X(t-y)f_Y(y)$ by the independence property.

Is this correct?

No. You have the right idea, but you begin with a function of $x$ and $t$, so you should not end with a function of $t$ and $y$.

Rather:

$$\begin{align}f_{X,T}(x,t)&=f_{X,Y}(x,t-x)\\&=f_X(x)\cdot f_Y(t-x)\\&= \theta^2\mathrm e^{-\theta x}\mathrm e^{-\theta (t-x)}\mathbf 1_{0\leq x}\mathbf 1_{0\leq t-x}\\&=\theta^2\mathrm e^{-\theta t}\,\mathbf 1_{0\leq x\leq t}\end{align}$$
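A quick numerical check of this density (a Python sketch, assuming rate $\theta=1$): integrating $f_{X,T}(x,t)$ over $x\in[0,t]$ by a simple midpoint rule should recover $t\,\theta^2 e^{-\theta t}$, the Gamma$(2,\theta)$ density of $T$.

```python
import math

theta = 1.0  # arbitrary rate

def f_XT(x, t):
    """Joint density of (X, T) derived above: theta^2 e^{-theta t} on 0 <= x <= t."""
    return theta**2 * math.exp(-theta * t) if 0 <= x <= t else 0.0

def f_T_numeric(t, n=1000):
    """Midpoint-rule integral of f_XT over x in [0, t]: the marginal density of T."""
    dx = t / n
    return sum(f_XT((i + 0.5) * dx, t) for i in range(n)) * dx

t = 1.7
print(f_T_numeric(t), t * theta**2 * math.exp(-theta * t))  # should agree
```

Because the density is constant in $x$ on $[0,t]$, the midpoint rule is exact here up to floating-point rounding.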


From this density we obtain the result:

$$\begin{align}\mathsf P(X\leq x\mid X+Y=t) &=\dfrac{\int_0^x f_{X,Y}(s,t-s)\,\mathrm d s}{\int_0^t f_{X,Y}(s,t-s)\,\mathrm d s}\mathbf 1_{0\leq x\lt t}+\mathbf 1_{t\leq x}\\[2ex]&=\dfrac{\theta^2\mathrm e^{-\theta t}\int_0^x \mathrm ds }{\theta^2\mathrm e^{-\theta t}\int_0^t \mathrm ds}\,\mathbf 1_{0\leq x\lt t}+\mathbf 1_{t\leq x}\\[2ex]&=\dfrac{x}{t}\,\mathbf 1_{0\leq x<t}+\mathbf 1_{t\leq x}\end{align}$$

So the conclusion holds in the form $P[X\leq x\mid X+Y]=\dfrac{x}{X+Y}\,\mathbf 1_{x< X+Y}+\mathbf 1_{X+Y\leq x}$; the expression $\dfrac{x}{X+Y}$ is valid on the event $\{x < X+Y\}$.
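One consequence of $P[X\leq x\mid X+Y=t]=x/t$ worth checking numerically (a sketch, not from the original answer): the ratio $X/(X+Y)$ should then be Uniform$(0,1)$. A simulation with an arbitrary rate:

```python
import random

random.seed(1)
theta = 2.0          # arbitrary rate; the ratio's law does not depend on it
n = 500_000

ratios = []
for _ in range(n):
    X = random.expovariate(theta)
    Y = random.expovariate(theta)
    ratios.append(X / (X + Y))

mean = sum(ratios) / n
below_half = sum(r < 0.5 for r in ratios) / n
print(mean, below_half)  # both near 0.5 if X/(X+Y) is Uniform(0,1)
```

The mean and the mass below $1/2$ are two cheap moments of the uniform law; a full distributional test would compare the empirical CDF against the identity on $[0,1]$.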


Here's another solution to this problem. If we know $g : \mathbb R^2 \to \mathbb R$ is continuous (as we suspect the joint density of $X$ and $X+Y$ to be), then we can use the fundamental theorem of calculus:
$$ g(x,y) = \frac{\partial^2}{\partial x \partial y} \int_0^x \int_0^y g(s,t) \, dt \, ds. $$
In this case, to find the joint density of $X$ and $X+Y$, we first observe:
$$ \mathbb 1_{\{X \leq x\} \cap \{X+Y \leq z\}} = \mathbb 1_{\{X \leq x \} \cap \{Y \leq z - X\}} = \mathbb 1_{A(x,z)}(X,Y), $$
where $A(x,z) = \{(s,t) \in \mathbb R^2 : 0 \leq s \leq x,\ 0 \leq t \leq z-s\}$. So we compute:
\begin{align*} \mathbb P \left[ \{X \leq x\} \cap \{X+Y \leq z\}\right] &= \int \mathbb 1_{A(x,z)}(X,Y) \, d\mathbb P \\ &= \int_{A(x,z)} d\left(\mathbb P \circ(X \times Y)^{-1}\right) \\ &= \int_0^x \int_0^{z-s} \theta^2 e^{-\theta(s+t)} \, dt \, ds \\ &= 1 - e^{-\theta x} - \theta x e^{-\theta z}. \end{align*}
Differentiating this with respect to $x$ and $z$, and noting $X, Y \geq 0$ and $\mathbb P[\{X > z\} \cap \{X+Y \leq z\}] = 0$, we find that the joint density $f$ of $X$ and $X+Y$ is
$$ f(x,z) = \theta^2 e^{-\theta z} \mathbb 1_{[x,\infty)}(z) \mathbb 1_{[0,\infty)}(x). $$
This joint density, together with part (i) of this exercise, can be used to compute both $\mathbb E[X\mid X+Y]$ and $\mathbb P[X \leq x \mid X+Y]$.
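The closed-form joint CDF $1 - e^{-\theta x} - \theta x e^{-\theta z}$ (for $0 \leq x \leq z$) can be checked by Monte Carlo; this sketch uses an arbitrary point $(x, z)$ with $x < z$ and rate $\theta = 1$.

```python
import math
import random

random.seed(2)
theta = 1.0
x, z = 0.8, 1.5    # arbitrary evaluation point with 0 < x < z

n = 1_000_000
count = 0
for _ in range(n):
    X = random.expovariate(theta)
    Y = random.expovariate(theta)
    count += (X <= x and X + Y <= z)

empirical = count / n
closed_form = 1 - math.exp(-theta * x) - theta * x * math.exp(-theta * z)
print(empirical, closed_form)   # should agree to Monte Carlo accuracy
```

Differentiating the same closed form symbolically in $x$ and $z$ reproduces the joint density $\theta^2 e^{-\theta z}$ on $\{0 \leq x \leq z\}$, as derived above.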