The joint probability of random variables that follow the same distribution.


Let's say that we have three random variables whose distributions we know: $X_{1}, X_{2} \sim N(\mu,\sigma^2)$ and $A \sim N(a,\sigma^2)$, where we know for sure that $a \neq \mu$. What can we say about the joint distributions $P(X_{1},A)$ and $P(X_{2},A)$?

Does it hold that $P(X_{1},A) = P(X_{2},A)$, since $X_{1}$ and $X_{2}$ follow the same distribution? If not, why not? And if it does not hold, can any other relation be stated between the two joint distributions?

Edit: I should also mention that $X_{1}$ and $X_{2}$ are both correlated with $A$ in the same way. For example, if $A = a_{1}+a_{2}+\dots+a_{n}$ is a sum of $n$ random variables, then $X_{1}$ and $X_{2}$ both share the same subset of $A$'s summands: e.g. $X_{1} = a_{1} + x_{2}^{1} + \dots + x_{n}^{1}$ and $X_{2} = a_{1} + x_{2}^{2} + \dots + x_{n}^{2}$, both containing the common term $a_{1}$.


BEST ANSWER

As @DilipSarwate pointed out, knowledge of correlation (the numerical value) is not sufficient to specify the joint distribution in general.

However, when you say "$X_1$ and $X_2$ are correlated with $A$ in the same way," perhaps you mean something much stronger than just having the same correlation (the numerical values). Specifically, if:

  • $A = A_1 + A_2$

  • $X_1 = A_1 + Y_1$

  • $X_2 = A_1 + Y_2$

  • $A_1, A_2, Y_1, Y_2$ are mutually independent

  • $Y_1, Y_2$ are identically distributed

then $P(X_1, A) = P(X_2, A)$ by symmetry. Here each of $A_1, A_2, Y_1, Y_2$ can be a sum of other r.v.s (perhaps your $a$'s and $x$'s) which can be Gaussians or not.
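As a quick sanity check, here is a small Monte Carlo sketch of the construction above. The component distributions and parameters are illustrative choices (not taken from the question); the point is only that $(X_1, A)$ and $(X_2, A)$ end up with matching empirical moments:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Mutually independent components; the distributions are illustrative.
A1 = rng.normal(0.0, 1.0, n)
A2 = rng.normal(2.0, 1.0, n)   # A2 shifts E[A] so that a != mu
Y1 = rng.normal(0.0, 1.0, n)
Y2 = rng.normal(0.0, 1.0, n)   # Y1, Y2 identically distributed

A  = A1 + A2                   # A  = A1 + A2
X1 = A1 + Y1                   # X1 = A1 + Y1
X2 = A1 + Y2                   # X2 = A1 + Y2

# If (X1, A) and (X2, A) have the same joint distribution, their
# empirical covariance matrices should agree (up to sampling noise).
cov1 = np.cov(X1, A)
cov2 = np.cov(X2, A)
print(np.round(cov1, 2))
print(np.round(cov2, 2))
```

Both matrices estimate the same population covariance $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, since $\operatorname{Cov}(X_i, A) = \operatorname{Var}(A_1) = 1$ here; swapping $Y_1$ for $Y_2$ changes nothing, which is the symmetry at work.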

Not all of the preconditions are strictly necessary (for instance, I would think the independence of $A_2$ can be weakened), but given the way you write the little $a$'s and $x$'s, I suspect the construction above fits your situation. Does this help?