Define two independent geometric random variables:
- X the number of days until Lily sends you a letter, with an average waiting time of $1/\lambda$ days $(0 < \lambda < 1, X ≥ 1)$
- Y the number of days until Noah sends you a letter, with an average waiting time of $1/\mu$ days $(0 < \mu < 1, Y ≥ 1)$
Let $D$ be the number of days until the first letter (from either of them) arrives.
- What is the cumulative distribution function of D?
- Derive the p.m.f. of I defined by \begin{equation} I= \begin{cases} 0 & \text{if Lily's letter arrives strictly before Noah's}\\ 1 & \text{if both letters arrive on the same day}\\ 2 & \text{if Lily's letter arrives strictly after Noah's} \end{cases} \end{equation}
- Are $I$ and $D$ independent?
My Attempt
- Since both are geometric distributions, we know that $\mathbb{E}(X)=1/p$ for a geometric distribution with parameter $p$. So the parameters of $X$ and $Y$ are $\lambda$ and $\mu$ respectively. Then, for integer $t\geq 1$, we have $$F_X(t)=\mathbb{P}(X\leq t)=1-(1-\lambda)^t\quad\text{(known cdf)}$$ $$F_Y(t)=\mathbb{P}(Y\leq t)=1-(1-\mu)^t$$
Then $D=\min(X,Y)$, and for integer $t\geq 0$ $$1-F_D(t)=\mathbb{P}(D>t)=\mathbb{P}(X>t)\mathbb{P}(Y>t)=(1-\lambda)^t(1-\mu)^t$$ So our cdf is \begin{equation} F_D(t)= \begin{cases} 0 & \text{if } t<1\\ 1-[(1-\lambda)(1-\mu)]^{\lfloor t\rfloor} & \text{if } t\geq 1 \end{cases} \end{equation} i.e. $D$ is itself geometric with parameter $1-(1-\lambda)(1-\mu)$.
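As a quick Monte Carlo sanity check (not part of the derivation; the parameter values $\lambda=0.3$, $\mu=0.2$ and the helper `geom` are illustrative), $D=\min(X,Y)$ should behave like a geometric variable with parameter $1-(1-\lambda)(1-\mu)$:

```python
import random

def geom(p):
    """Sample a geometric random variable on {1, 2, ...} with success probability p."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

random.seed(1)
lam, mu = 0.3, 0.2          # arbitrary example parameters
n = 200_000
d_samples = [min(geom(lam), geom(mu)) for _ in range(n)]

# If the derivation is right, D is geometric with parameter 1 - (1-lam)(1-mu)
p_d = 1 - (1 - lam) * (1 - mu)
for t in (1, 2, 5):
    empirical = sum(d <= t for d in d_samples) / n
    theoretical = 1 - (1 - p_d) ** t
    print(t, round(empirical, 3), round(theoretical, 3))  # columns should agree closely
```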
Representing $I$ in terms of $X,Y$: \begin{equation} I= \begin{cases} 0 &\text{if } X<Y\\ 1 & \text{if } X=Y\\ 2 & \text{if } X>Y \end{cases} \end{equation} $$\mathbb{P}(I=0)=\mathbb{P}(X<Y)=\sum_{x\in X} \sum_{y\in Y|x<y}\mathbb{P}(x,y)$$ Since $X,Y$ are independent $$\mathbb{P}(x,y)=\mathbb{P}_X(x)\mathbb{P}_Y(y)$$ $$=[(1-\lambda)^{k-1}\lambda][(1-\mu)^{k-1}\mu]$$ So $$\mathbb{P}(X<Y)=\sum_{x\in X} \sum_{y\in Y|x<y}[(1-\lambda)^{k-1}\lambda][(1-\mu)^{k-1}\mu]$$ I'm not sure how to compute $\mathbb{P}(X=Y)$, but I think $\mathbb{P}(X>Y)$ would be similar to the case above. For a pmf, do I have to compute the sum? Or do I just leave it as it is?
I'd assume you'd have to find the p.m.f. for I before determining independence.
$$P(X < Y) = \sum_{x=1}^\infty P(X=x, Y> x) = \sum_{x=1}^\infty P(X=x) P(Y>x) = \sum_{x=1}^\infty [(1-\lambda)^{x-1} \lambda] (1-\mu)^x.$$ If you plug $P(Y>x) = \sum_{y=x+1}^\infty (1-\mu)^{y-1} \mu$ into the expression above, you get a double sum that resembles what you have written, but this is unnecessary since you already know $P(Y>x) = 1-F_Y(x)$. (I think you got confused by your own notation: you write $k$ where you mean $x$ and $y$.)
You should be able to simplify the above sum using geometric series.
$$\sum_{x=1}^\infty [(1-\lambda)^{x-1} \lambda] (1-\mu)^x = \lambda(1-\mu) \sum_{x=1}^\infty [(1-\lambda)(1-\mu)]^{x-1} = \frac{\lambda(1-\mu)}{1-(1-\lambda)(1-\mu)}$$
By a similar argument, $P(X > Y) = \frac{(1-\lambda)\mu}{1-(1-\lambda)(1-\mu)}$.
$P(X=Y)$ can be done similarly, just go through the cases $P(X=x, Y=x)$. You should get $\frac{\lambda\mu}{1-(1-\lambda)(1-\mu)}$.
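As a sanity check, the three probabilities sum to $1$, since the numerators add up to the common denominator: $$\lambda(1-\mu)+(1-\lambda)\mu+\lambda\mu=\lambda+\mu-\lambda\mu=1-(1-\lambda)(1-\mu).$$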
$$P(I=0, D=d) = P(\text{no letters for $d-1$ days})\,P(\text{Lily's letter but not Noah's on day $d$})= (1-\lambda)^{d-1} (1-\mu)^{d-1} \cdot \lambda(1-\mu).$$
Check that this equals $P(I=0)P(D=d)$. Repeat for other values of $I$.
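To make the check concrete, here is a small numerical verification of the factorization for the $I=0$ case (the parameter values are arbitrary; since the identity is exact, it holds to floating-point precision):

```python
lam, mu = 0.3, 0.2
q = (1 - lam) * (1 - mu)            # P(no letter on a given day)

p_i0 = lam * (1 - mu) / (1 - q)     # P(I = 0) from the pmf derived above
for d in range(1, 11):
    joint = q ** (d - 1) * lam * (1 - mu)   # P(I = 0, D = d)
    p_d = q ** (d - 1) * (1 - q)            # P(D = d): D is geometric with parameter 1 - q
    assert abs(joint - p_i0 * p_d) < 1e-12  # factorization holds
print("P(I=0, D=d) = P(I=0) P(D=d) for d = 1..10")
```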