Why is the second derivative of the probability distribution function in this case not a valid probability density function?


I am asked to find the covariance of $X$ and $Z=\min(X,Y)$, where $X$ has exponential distribution $\epsilon(2)$, $Y$ has exponential distribution $\epsilon(3)$, and $X$ and $Y$ are independent. I am trying to find $E(XZ)$. To do this I calculate, in the case $x \leq z$, $$F_{X,Z} (x,z) = p \{X \leq x, Z \leq z \} = p\{X \leq x \} = 1 - e^{-2x},$$ and in the case $x > z$ (where the event $Z \leq z$ with $X > z$ forces $Y \leq z$, and $Y$ is independent of $X$), $$F_{X,Z} (x,z) = p\{ X \leq z\} + p\{z<X\leq x, Z \leq z \} = (1 - e^{-2z}) + (e^{-2z} - e^{-2x})(1 - e^{-3z}) = 1 - e^{-2x} - e^{-5z} +e^{-2x -3z}.$$

So the density function should be $F_{X,Z}$ differentiated with respect to $x$ and $z$: $$f_{X,Z}(x,z) = 6e^{-3z-2x}$$ when $x > z$, and $0$ otherwise.

But when I integrate this: $$\int_{0}^{+\infty} \int_{0}^{x} 6 e^{-3z-2x} dzdx = \frac{3}{5}$$

I don't get $1$. Why?
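For what it's worth, a quick simulation (a sketch in Python, with the rates $2$ and $3$ as above) confirms that the region $\{x > z\}$ carries probability exactly $3/5$:

```python
import random

# Sanity-check by simulation: X ~ Exp(rate 2), Y ~ Exp(rate 3), Z = min(X, Y).
# Estimate the probability of the region {X > Z}, where the candidate density lives.
random.seed(0)
n = 200_000
count = 0
for _ in range(n):
    x = random.expovariate(2.0)  # X ~ Exp(2)
    y = random.expovariate(3.0)  # Y ~ Exp(3)
    z = min(x, y)
    if x > z:                    # strictly above the diagonal
        count += 1

print(count / n)  # ≈ 0.6 = 3/5, matching the integral
```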




$(X,Z)$ is not jointly continuously distributed, so one cannot obtain its density by pointwise differentiation of its CDF. The actual source of the error is that $F$ is not even differentiable at the points $(x,x)$ with $x>0$. You can see this by comparing the (unnormalized) directional derivatives in the directions $(1,0)$, $(0,1)$ and $(1,1)$: if $F$ were differentiable, the first two would add up to the third, but they do not.
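One can see this numerically with a small finite-difference sketch, using the CDF computed in the question (the point $a$ and step $h$ below are arbitrary choices):

```python
import math

# Finite-difference sketch at a diagonal point (a, a); F is the CDF from the question.
def F(x, z):
    if x <= z:
        return 1 - math.exp(-2 * x)
    return 1 - math.exp(-2 * x) - math.exp(-5 * z) + math.exp(-2 * x - 3 * z)

a, h = 1.0, 1e-6
d_x  = (F(a + h, a) - F(a, a)) / h      # directional derivative along (1, 0)
d_z  = (F(a, a + h) - F(a, a)) / h      # along (0, 1)
d_xz = (F(a + h, a + h) - F(a, a)) / h  # along (1, 1)

print(d_x + d_z, d_xz)  # the gap is about 2*exp(-5*a): F is not differentiable here
```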


There is no joint density of $(X,Z)$ (with respect to Lebesgue measure) because $$P(X=Z)=P(X=\min(X,Y))=P(X<Y)=\frac{2}{2+3}=\frac{2}{5}>0$$

And we know that Lebesgue measure (or area) of the set $\{(x,z):x=z\}$ is $0$.

In other words, $(X,Z)$ is not absolutely continuous. The joint distribution function of $(X,Z)$ exists, of course, but differentiating it does not produce a density function.

Now, writing $I$ for an indicator function, one can decompose $$Z=\min(X,Y)=XI_{X<Y}+YI_{X>Y}$$ (the event $\{X=Y\}$ has probability $0$, so it can be ignored).

Therefore,

$$E(XZ)=E(X^2I_{X<Y})+E(XYI_{X>Y})$$

So, if $f_{X,Y}$ is the joint density of $(X,Y)$, both expectations above can be found via the law of the unconscious statistician:

$$E(X^2I_{X<Y})=\iint_{x<y} x^2 f_{X,Y}(x,y)\,\mathrm{d}x\,\mathrm{d}y$$

$$E(XYI_{X>Y})=\iint_{x>y} xy \,f_{X,Y}(x,y)\,\mathrm{d}x\,\mathrm{d}y$$
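Evaluating these integrals by hand (not shown in the answer) gives $E(X^2I_{X<Y})=4/125$ and $E(XYI_{X>Y})=27/250$, hence $E(XZ)=7/50$. A Monte Carlo sketch agrees:

```python
import random

# Monte Carlo check of the two expectations above (rates 2 and 3 as in the question).
random.seed(2)
n = 500_000
s1 = s2 = 0.0
for _ in range(n):
    x = random.expovariate(2.0)
    y = random.expovariate(3.0)
    if x < y:
        s1 += x * x            # contributes to E(X^2 I_{X<Y})
    else:
        s2 += x * y            # contributes to E(XY I_{X>Y})

e1, e2 = s1 / n, s2 / n
print(e1, e2, e1 + e2)  # ≈ 0.032 (= 4/125), ≈ 0.108 (= 27/250), ≈ 0.14 (= 7/50)
```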