When two distributions have the same variance and shape, do we call them identically distributed, regardless of their mean (as this is usually a location parameter)?
What does "identically distributed" mean?
21.6k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 4 best solutions below.
Two (real-valued) random variables $X$ and $Y$ are identically distributed if $$ P(X \leq x) = P(Y \leq x) $$ for all $x \in \mathbb{R}$.
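As a minimal numerical sketch of this definition (an assumed example, not from the answer itself): take $X \sim N(0,1)$ and $Y = -X$. By symmetry of the standard normal, $P(Y \leq x) = P(X \geq -x) = 1 - \Phi(-x) = \Phi(x)$ for every $x$, so $X$ and $Y$ are identically distributed even though they are clearly not independent.

```python
import math

def phi(x):
    """CDF of the standard normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# X ~ N(0, 1) and Y = -X. For every x,
#   P(Y <= x) = P(X >= -x) = 1 - phi(-x) = phi(x),
# so the two CDFs agree everywhere: X and Y are identically distributed,
# even though Y is completely determined by X (no independence).
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    cdf_X = phi(x)
    cdf_Y = 1.0 - phi(-x)
    assert abs(cdf_X - cdf_Y) < 1e-12
```

This also illustrates that "identically distributed" says nothing about independence; it only compares the two CDFs.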
The probability assigned to each event must be the same. When the outcomes are real numbers, Artem's answer works, but not all probabilities are assigned to real numbers! For example, toss two identical coins $A, B$. If
$$\cases{P(A=heads) = P(B=heads)\\P(A=tails) = P(B=tails)}$$
and we know that $\Omega = \{heads, tails\}$, then they are identically distributed. But neither "heads" nor "tails" is a real number.
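A short sketch of this discrete case (an assumed example): on a finite outcome space such as {"heads", "tails"}, "identically distributed" simply means the two probability mass functions agree on every outcome, with no reference to real numbers.

```python
# Two coins described by their probability mass functions over a
# non-numeric outcome space {"heads", "tails"}.
coin_A = {"heads": 0.5, "tails": 0.5}
coin_B = {"heads": 0.5, "tails": 0.5}

# Identically distributed <=> same outcomes, same probability for each one.
identically_distributed = (
    set(coin_A) == set(coin_B)
    and all(abs(coin_A[o] - coin_B[o]) < 1e-12 for o in coin_A)
)
# identically_distributed is True, even though "heads" is not a real number.
```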
We never call two distributions identically distributed.
We do, however, call two random variables $X, Y$ identically distributed.
This is the case if they have the same distribution, i.e. if $P_X = P_Y$, where $P_X$ stands for the induced probability measure prescribed by $A \mapsto P(\{X \in A\})$ on Borel-measurable sets $A$.
A sufficient condition for this is $F_X = F_Y$, where $F_X$ denotes the CDF of $X$.
This condition is also necessary, since $F_X$ is in effect a restriction of $P_X$ to specific Borel-measurable sets, namely the sets $(-\infty, x]$. That direction is less interesting, but worth mentioning, as the comments on this question show.
Things like mean, variance, and shape (I suspect you mean the graph of a possible PDF) are, if they exist, completely determined by the distribution. The converse fails: sharing variance and shape while differing in mean is not enough, since a different mean already forces a different CDF.
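This directly answers the original question; a small sketch (an assumed example, not from the answer): $X \sim N(0,1)$ and $Y \sim N(1,1)$ have the same variance and the same bell shape, but their CDFs differ, so they are not identically distributed.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Same variance (1) and same shape, but different means:
# X ~ N(0, 1) versus Y ~ N(1, 1). Compare the CDFs at x = 0:
p_X = normal_cdf(0.0, 0.0, 1.0)  # P(X <= 0) = 0.5
p_Y = normal_cdf(0.0, 1.0, 1.0)  # P(Y <= 0), roughly 0.16

# Since P(X <= 0) != P(Y <= 0), X and Y are not identically distributed,
# even though the mean is "only" a location parameter.
```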
In probability theory and statistics, a sequence or other collection of random variables is independent and identically distributed (i.i.d.) if each random variable has the same probability distribution as the others and all are mutually independent.