Joint probability density function of the sum of dependent random vectors


I have a random variable $\beta$ and a random vector $U = [ U_1, U_2]^T$, with known marginal probability density functions $f_{\beta} (\beta)$, $f_{U_1}(u_1)$ and $f_{U_2}(u_2)$, respectively.

The random vector $V$ is the product of the random variable $\beta$ and the random vector $U$, i.e.

$$ V = \left[ \begin{matrix} V_1 \\ V_2 \end{matrix} \right] = \beta U = \left[ \begin{matrix} \beta U_1 \\ \beta U_2 \end{matrix} \right] $$

The joint probability density function of $V$ can be obtained from its CDF via the relation:

$$ F_{V_1,V_2}(v_1,v_2) = Pr(U_1<v_1/\beta, \: U_2<v_2/\beta, \: \beta > 0) + Pr(U_1>v_1/\beta, \: U_2>v_2/\beta, \: \beta < 0)\\ f_{V_1,V_2}(v_1,v_2) = \frac{\partial^2 F_{V_1,V_2}(v_1,v_2)}{\partial v_1 \, \partial v_2}\\ f_{V_1,V_2}(v_1,v_2) = \int_{-\infty}^\infty \frac{f_\beta(\beta)}{\beta^2} f_{U_1} \left( \frac{v_1}{\beta} \right) f_{U_2} \left( \frac{v_2}{\beta} \right) \, d\beta $$
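As a sanity check, this mixture integral can be compared against a Monte Carlo estimate for a concrete choice of distributions. The sketch below is my own numerical check, not part of the problem statement: it assumes $\beta$, $U_1$, $U_2$ are i.i.d. standard normal and evaluates the density at the (arbitrary) point $(v_1, v_2) = (1, 1)$ both by numerical integration and by a box-count estimate from samples of $\beta U$.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Assumed concrete example: beta, U1, U2 ~ i.i.d. standard normal.
v1, v2 = 1.0, 1.0

# Integrand of f_V(v1, v2) = int f_beta(b) / b^2 * f_U1(v1/b) * f_U2(v2/b) db
def integrand(b):
    return norm.pdf(b) / b**2 * norm.pdf(v1 / b) * norm.pdf(v2 / b)

# The integrand is even in b and vanishes as b -> 0 (for v1, v2 != 0),
# so integrate over (0, inf) and double.
density_integral = 2 * quad(integrand, 1e-8, 50.0, limit=200)[0]

# Monte Carlo estimate: fraction of samples of V = beta * U landing in a
# small box centred at (v1, v2), divided by the box area.
rng = np.random.default_rng(0)
n = 2_000_000
beta = rng.standard_normal(n)
V1 = beta * rng.standard_normal(n)
V2 = beta * rng.standard_normal(n)
h = 0.1  # half-width of the box
inside = (np.abs(V1 - v1) < h) & (np.abs(V2 - v2) < h)
density_mc = inside.mean() / (2 * h) ** 2

print(density_integral, density_mc)  # the two estimates should be close
```

With these choices the two numbers agree up to Monte Carlo noise and the small bias of the finite box, which supports the formula above.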

Then, I have a random vector $X$ defined as

$$ X = \left[ \begin{matrix} X_1 \\ X_2 \end{matrix} \right] = V + W = \left[ \begin{matrix} V_1 + W_1 \\ V_2 + W_2 \end{matrix} \right] $$ where $W = [W_1, W_2]^T$ has known marginals $f_{W_1}(w_1)$ and $f_{W_2}(w_2)$.

How can I evaluate the joint probability density function of the vector $X$?

I have tried the same CDF approach as above and obtained the following expression:

$$ f_{X_1,X_2}(x_1,x_2) = \int_{-\infty}^{\infty}f_{W_1}(\tau)f_{V_1}(x_1-\tau) \, d\tau \int_{-\infty}^{\infty}f_{W_2}(\nu)f_{V_2}(x_2-\nu) \, d\nu $$

But I think this holds only if $V_1$ and $V_2$ are independent, and since they share the same $\beta$ in their construction, they are dependent. Am I right?
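This suspicion can be checked numerically. The sketch below is an illustrative example of my own (not from the problem statement): it takes $\beta$ uniform on $\{1, 3\}$ and $U_1$, $U_2$ standard normal, all independent. The linear correlation of $V_1$ and $V_2$ is essentially zero, yet the shared $\beta$ couples their magnitudes, so $E[V_1^2 V_2^2] \neq E[V_1^2]\,E[V_2^2]$ and the two components are not independent.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Assumed example: beta uniform on {1, 3}; U1, U2 standard normal,
# all mutually independent.
beta = rng.choice([1.0, 3.0], size=n)
V1 = beta * rng.standard_normal(n)
V2 = beta * rng.standard_normal(n)

# Linear correlation is ~0, but zero correlation does not imply independence.
corr = np.corrcoef(V1, V2)[0, 1]

# The shared beta couples the magnitudes. For this example, theoretically:
#   E[V1^2 V2^2] = E[beta^4] = (1 + 81)/2 = 41
#   E[V1^2] E[V2^2] = E[beta^2]^2 = 5^2 = 25
lhs = np.mean(V1**2 * V2**2)
rhs = np.mean(V1**2) * np.mean(V2**2)

print(corr, lhs, rhs)
```

Since $V_1$ and $V_2$ are dependent, the product of the two one-dimensional convolutions above cannot in general equal the joint density of $X$.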

Edit: $U_1$, $U_2$, $\beta$, $W_1$ and $W_2$ are mutually independent random variables.