independence of gamma random variables - is this correct?


Suppose $X \sim \Gamma[n_1,\lambda], Y \sim \Gamma[n_2,\lambda]$, and $X+Y \sim \Gamma[n_1 + n_2, \lambda]$

Can we say that $X$ and $Y$ are independent?

Here's what I think: suppose $f,g$ are the p.d.f.'s of $X$ and $Y$, and let $h$ be their joint p.d.f. The assumption on $X+Y$ amounts to saying that

$$ \forall z>0,\int_0^z h(x,z-x)dx = \int_0^z f(x)g(z-x)dx $$

Does this imply $h(x,y) = f(x)g(y)$ a.e.?

Well, I know that if $$ \int_0^x f(u)\,du = \int_0^x g(u)\,du $$ for all $x$, then by differentiating with respect to $x$ we get $f = g$ a.e.

However, I don't know how to differentiate the previous formula with respect to $z$, since $x$ and $z$ appear in the integrand together. If I naively differentiate with respect to $z$, then since $z-z=0$ I get $h(z,0) = f(z)g(0)$, which doesn't seem to help.

We know that if $X$ and $Y$ are independent, then additivity holds. Now I'm working on a problem that requires proving the independence of two random variables. I found that they follow Gamma distributions and obey additivity, but I'm not sure it is legitimate to conclude that the two random variables are independent.

Thanks!


BEST ANSWER

In terms of characteristic functions, the question is whether two variables are independent whenever the product of their characteristic functions equals the characteristic function of their sum: $$\mathbb E[e^{itX}]\cdot \mathbb E[e^{itY}] = \mathbb E[e^{it(X+Y)}].\tag{1}$$ We know that the variables are independent iff for all $t,s\in \mathbb{R}$ $$\mathbb E[e^{itX}]\cdot \mathbb E[e^{isY}] = \mathbb E[e^{i(tX+sY)}],$$ which is much stronger than $(1)$: condition $(1)$ only constrains the diagonal $s=t$. So it should be clear that the answer is negative, but it is not so easy to construct a concrete counterexample.


I will construct an example in the case $n_1 = n_2 = 1$, $\lambda = 1$, so that both variables are $\mathrm{Exp}(1)$ (once you understand this construction, you will be able to adapt it to the general case).

Let us first reduce the problem to a discrete one, which is easier to handle. It is well known (and easy to prove) that for an exponentially distributed random variable $X$, the integer part $\lfloor X\rfloor$ and the fractional part $\{X\}$ are independent, and the integer part has a shifted geometric distribution: $$ p_n := \mathbb{P}(\lfloor X\rfloor = n) = e^{-n}(1-e^{-1}), \qquad n\ge 0. $$
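As a quick sanity check (a numerical sketch, not part of the argument), for $X \sim \mathrm{Exp}(1)$ the joint law $\mathbb{P}(\lfloor X\rfloor = n,\ \{X\}\le t) = \int_n^{n+t} e^{-x}\,dx = e^{-n}(1-e^{-t})$ factors exactly into the product of the two marginals:

```python
import math

# For X ~ Exp(1): P(floor(X) = n, frac(X) <= t) = e^{-n} * (1 - e^{-t}).
# Marginals: P(floor(X) = n) = e^{-n} * (1 - e^{-1})        (shifted geometric)
#            P(frac(X) <= t) = (1 - e^{-t}) / (1 - e^{-1})  (sum over n >= 0)
for n in range(6):
    for t in (0.1, 0.5, 0.9):
        joint = math.exp(-n) * (1 - math.exp(-t))
        p_floor = math.exp(-n) * (1 - math.exp(-1))
        p_frac = (1 - math.exp(-t)) / (1 - math.exp(-1))
        # independence of integer and fractional part holds exactly
        assert abs(joint - p_floor * p_frac) < 1e-15
```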


So we want to construct two dependent variables $X'$, $Y'$ such that $$\mathbb{P}(X' = n) = \mathbb{P}(Y' = n) = p_n, \qquad n\ge 0,\tag{2}$$ and $$\mathbb{P}(X'+Y' = n) = \sum_{k=0}^n p_kp_{n-k}, \qquad n\ge 0.\tag{3}$$ Start by setting $p_{k,j} = p_kp_j$. Now, for some very small $\epsilon>0$, distort six values: $$\begin{aligned} p'_{1,0} &= p_{1,0}+\epsilon, & p'_{2,0} &= p_{2,0}-\epsilon,\\ p'_{0,1} &= p_{0,1}-\epsilon, & p'_{2,1} &= p_{2,1}+\epsilon,\\ p'_{0,2} &= p_{0,2}+\epsilon, & p'_{1,2} &= p_{1,2}-\epsilon, \end{aligned}$$ and set $\mathbb{P}(X' = k,Y'=j) = p'_{k,j}$ for these six pairs and $\mathbb{P}(X' = k,Y'=j) = p_{k,j}$ otherwise. The distortions cancel along every row, every column, and every anti-diagonal $k+j=n$, so $X',Y'$ satisfy $(2)$ and $(3)$; they are clearly dependent, since e.g. $\mathbb{P}(X'=1,Y'=0)\ne p_1p_0$.
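The cancellations above can be verified numerically. The sketch below (my own check, not part of the original answer) truncates the support at $N=60$, where the tail mass $\sim e^{-60}$ is negligible, and confirms that the six $\epsilon$-distortions leave both marginals and the law of the sum unchanged while breaking independence:

```python
import math

# Shifted-geometric pmf of floor(Exp(1)): p_n = e^{-n} (1 - e^{-1}).
N = 60                 # truncation level; tail mass ~ e^{-60} is negligible
p = [math.exp(-n) * (1 - math.exp(-1)) for n in range(N)]

eps = 1e-4             # small enough that all entries stay nonnegative
joint = [[p[k] * p[j] for j in range(N)] for k in range(N)]

# The six distorted cells from the construction above.
joint[1][0] += eps; joint[2][0] -= eps
joint[0][1] -= eps; joint[2][1] += eps
joint[0][2] += eps; joint[1][2] -= eps

# (2): both marginals are unchanged.
for k in range(N):
    assert abs(sum(joint[k][j] for j in range(N)) - p[k]) < 1e-12  # X' marginal
    assert abs(sum(joint[i][k] for i in range(N)) - p[k]) < 1e-12  # Y' marginal

# (3): the law of X' + Y' still equals the convolution of the marginals.
for n in range(2 * N - 1):
    lo, hi = max(0, n - N + 1), min(n, N - 1)
    s = sum(joint[k][n - k] for k in range(lo, hi + 1))
    c = sum(p[k] * p[n - k] for k in range(lo, hi + 1))
    assert abs(s - c) < 1e-12

# ... but X' and Y' are no longer independent:
assert abs(joint[1][0] - p[1] * p[0]) > eps / 2
```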


Finally, let $X''$, $Y''$ be independent random variables, also independent of $(X', Y')$, each distributed as the fractional part of an $\mathrm{Exp}(1)$ variable.

Setting $X = X'+X''$ and $Y = Y'+Y''$, we arrive at the required example: $X$ and $Y$ are each $\mathrm{Exp}(1)$, and $X+Y$ has the same law as a sum of independent $\mathrm{Exp}(1)$ variables, i.e. $\Gamma[2,1]$, yet $X$ and $Y$ are dependent.
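To see why the assembled $X = X' + X''$ has the right marginal law, note that for $x = n + t$ with $0\le t<1$, $\mathbb{P}(X\le x) = \sum_{k<n} p_k + p_n\,\mathbb{P}(X''\le t)$, which telescopes to $1-e^{-x}$, the $\mathrm{Exp}(1)$ c.d.f. A numerical sketch of this identity (using the fractional-part c.d.f. $F''(t)=(1-e^{-t})/(1-e^{-1})$ from above):

```python
import math

# P(X' = k) = p_k = e^{-k} (1 - e^{-1})   (shifted geometric integer part)
def p(k):
    return math.exp(-k) * (1 - math.exp(-1))

# CDF of the fractional part of Exp(1).
def frac_cdf(t):
    return (1 - math.exp(-t)) / (1 - math.exp(-1))

# With X'' independent of X', the cdf of X = X' + X'' at x = n + t is
#   P(X <= x) = sum_{k<n} p_k + p_n * frac_cdf(t),
# which should equal the Exp(1) cdf 1 - e^{-x}.
for n in range(8):
    for t in (0.0, 0.25, 0.5, 0.75, 0.99):
        x = n + t
        cdf = sum(p(k) for k in range(n)) + p(n) * frac_cdf(t)
        assert abs(cdf - (1 - math.exp(-x))) < 1e-12
```

The same bookkeeping, applied with the distorted joint law $(p'_{k,j})$ in place of $(p_kp_j)$, shows that the dependence of $(X',Y')$ carries over to $(X,Y)$ while all the marginal and sum laws match the independent case.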