Independence of Random samples


I have some questions that have been bothering me for a while now. First, how does one obtain the joint probability distribution function of $X_{1},\ldots ,X_{n}$? Would it be $\prod\limits_{i=1}^n F_{X_{i}}$? What about the marginal probability distributions? Are they just $F_{X}$ for every $i$?

Second, given two random variables, $X$ and $Y$, is it true that $X$ and $Y$ are independent if and only if $F_{X}=F_{Y}$? I think it is not true, but I can't readily find counterexamples.

Any form of help would be appreciated.

Thanks.

On BEST ANSWER

I assume $F_{X_i}$ is a shorthand for the cumulative distribution function $\Pr(X_i \le x_i)$ or something similar. As it is a function it would be better to show what it is a function of, for example $F_{X_i}(x_i)$.

If $X_{1},\ldots ,X_{n}$ are independent, then the probability that they are each at most their respective $x_i$ is indeed the product: $F_{X_1,\ldots,X_n}(x_1,\ldots,x_n)=\prod\limits_{i=1}^n F_{X_i}(x_i)$.
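To make this concrete, here is a small Monte Carlo sketch. The choice of two independent Uniform(0,1) variables is just an illustration (for Uniform(0,1), $F(x)=x$ on $[0,1]$, so the product of CDFs is easy to compute by hand):

```python
import random

random.seed(0)
n = 100_000

# Two independent Uniform(0,1) samples per trial.
pairs = [(random.random(), random.random()) for _ in range(n)]

x1, x2 = 0.3, 0.7
# Empirical joint CDF: fraction of trials with X1 <= x1 and X2 <= x2.
joint = sum(1 for a, b in pairs if a <= x1 and b <= x2) / n
# Product of marginal CDFs: F(x1) * F(x2) = x1 * x2 for Uniform(0,1).
product = x1 * x2

# By independence, these two numbers should agree up to sampling noise.
print(joint, product)
```

With 100,000 trials the empirical joint CDF and the product of marginals agree to within about a hundredth.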

The marginal cumulative distribution functions are still $F_{X_i}(x_i)$. If the distributions are identical, you might consider dropping the $i$s.

Your line on $F_{X}=F_{Y}$ seems to confuse *identically distributed* with *independent*. For independence, you want something like $F_{X|Y=y}(x)=F_{X}(x)$ for all $x$ and $y$: in other words, knowledge of $Y$ does not affect the distribution of $X$.

As a counterexample, just take any non-degenerate distribution for $Y$ and let $X=Y$. Then they obviously have the same cumulative distribution function but are not independent, since $Y$ determines $X$ exactly.
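The counterexample can also be checked numerically. Below is a sketch with $Y$ Uniform(0,1) (an illustrative choice) and $X=Y$: the marginals coincide, but the joint CDF at $(0.5, 0.5)$ is $P(Y \le 0.5) = 0.5$, not the product $0.5 \times 0.5 = 0.25$ that independence would require:

```python
import random

random.seed(1)
n = 100_000

ys = [random.random() for _ in range(n)]  # Y ~ Uniform(0,1)
xs = ys                                   # X = Y, so F_X = F_Y

x0 = 0.5
# Empirical joint CDF at (0.5, 0.5): since X = Y, this is just P(Y <= 0.5).
joint = sum(1 for x, y in zip(xs, ys) if x <= x0 and y <= x0) / n
# What independence would predict: F_X(0.5) * F_Y(0.5).
product = 0.5 * 0.5

print(joint, product)  # joint is near 0.5, product is 0.25
```

The gap between the two values is exactly the failure of independence, even though $F_X = F_Y$ holds by construction.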