While one can find various references online (and in books) for the definition of "convergence in distribution", I have been stumped several times today (as a non-probabilist) by the notion of "identical/equal in distribution" or "identity/equality in distribution". This seems to be notated with $\stackrel{d}{=}$. Does it mean something like equal almost everywhere in analysis? Are there any easy/obvious examples someone with only a first course in probability could understand?
I could not find a reference in the books in front of me (unfortunately I didn't have Feller handy) nor on the internet, including this site - the latter of which I found pretty surprising. But I'll be happy to close this if it's a dup.
Depending on how much you know about probability theory this might be of help.
A random variable $X$ has a probability distribution function ("cumulative distribution function" or CDF), denoted $F_X$, for which $P(X\le t) = F_X(t)$. Two different random variables $X$ and $Y$ can have the same probability distribution function, in which case we say they are "equal in distribution", written $X\stackrel{d}{=}Y$; this means exactly that $F_X=F_Y$.
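To make this concrete, here is a small simulation (my own illustration, not part of the definition): take $X$ to be a fair die roll and $Y = 7 - X$. Both are uniform on $\{1,\dots,6\}$, so their CDFs agree and $X\stackrel{d}{=}Y$, even though $X$ and $Y$ never take the same value on the same roll (that would require $X = 3.5$).

```python
import random

random.seed(0)
n = 100_000
xs = [random.randint(1, 6) for _ in range(n)]  # X: a fair die roll
ys = [7 - x for x in xs]                       # Y = 7 - X, also uniform on {1,...,6}

def empirical_cdf(samples, t):
    """Estimate P(sample <= t) from the data."""
    return sum(s <= t for s in samples) / len(samples)

# The empirical CDFs of X and Y are (approximately) the same at every point,
# even though X and Y differ on every single outcome.
for t in range(1, 7):
    print(f"t={t}: F_X(t) ~ {empirical_cdf(xs, t):.3f}, "
          f"F_Y(t) ~ {empirical_cdf(ys, t):.3f}")
```

Running this prints matching CDF values at each $t$, while `xs[i] != ys[i]` for every `i` — equality in distribution says nothing about the variables being equal as functions.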
What is a random variable? A measurable function on a probability measure space $(\Omega,\mathcal F, P)$. The distribution function of the random variable $X$ is then $F_X(t)=P(\{\omega\in\Omega: X(\omega)\le t\})$. The strange thing about equality in distribution is that $X$ and $Y$ might be functions on completely different probability spaces and yet be equal in distribution.
For example, suppose you and I each have a fair coin. You flip yours, and report the result $X$ as a $0$ or $1$; I flip mine similarly to get $Y$. Our two results are equal in distribution, even though there is no claim that $X=Y$.
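The coin example above can be sketched in a few lines of Python (a toy simulation of my own; the two independent generators stand in for the two different probability spaces):

```python
import random

rng_you = random.Random(1)  # "your" probability space
rng_me = random.Random(2)   # "my" probability space
n = 100_000

X = [rng_you.randint(0, 1) for _ in range(n)]  # your flips
Y = [rng_me.randint(0, 1) for _ in range(n)]   # my flips

# Both are Bernoulli(1/2), so X and Y are equal in distribution...
print("P(X = 1) ~", sum(X) / n)
print("P(Y = 1) ~", sum(Y) / n)
# ...but as random variables they agree only about half the time.
print("P(X = Y) ~", sum(x == y for x, y in zip(X, Y)) / n)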