I'm quite confused about how to find the limiting distribution of the minimum order statistic of a random sample. If you could explain the general steps and theory using this example, that would be great.
Let $(X_1,\dots,X_n)$ be a random sample from the Uniform$(0,1)$ distribution, and let $X^n_{(0)}$ be the minimal order statistic of this sample. What is the limiting distribution of $X^n_{(0)}$ as $n$ tends to infinity?
Because the $X_i$ are independent and identically distributed, the probability that the minimum exceeds a value $x$ is the probability that every variable exceeds $x$. In other words, for $0\le x\le 1$:
$P(\min(X_1,\dots,X_n)>x)=P\Big(\bigcap\limits_{i=1}^n\{X_i>x\}\Big)=P(X_1>x)^n=(1-x)^n.$
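As a quick sanity check, the formula $(1-x)^n$ can be compared against a Monte Carlo estimate. This is a minimal NumPy sketch; the sample size $n=10$, the threshold $x=0.2$, and the trial count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, x, trials = 10, 0.2, 100_000

# Draw `trials` samples of size n from Uniform(0,1) and estimate
# P(min(X_1, ..., X_n) > x) by the fraction of sample minima above x.
samples = rng.uniform(0.0, 1.0, size=(trials, n))
empirical = np.mean(samples.min(axis=1) > x)

exact = (1 - x) ** n  # the (1 - x)^n from the derivation above
print(f"empirical: {empirical:.4f}   exact: {exact:.4f}")
```

With $100{,}000$ trials the two values should agree to two or three decimal places.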
Now fix any $x>0$: then $(1-x)^n\to 0$ as $n\to\infty$, so $P(X^n_{(0)}>x)\to 0$. Since the $X_i$ are nonnegative, we also have $P(X^n_{(0)}<0)=0$ for every $n$. Hence the limiting distribution is degenerate at $0$: $X^n_{(0)}$ converges in distribution (indeed in probability) to the constant $0$, with limiting CDF $F(x)=0$ for $x<0$ and $F(x)=1$ for $x\ge 0$ (convergence holds at every continuity point of $F$).
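The concentration at $0$ is easy to see by simulation. The sketch below (assumed values for the trial count and the grid of sample sizes) averages the sample minimum over many trials for increasing $n$; since $E[X^n_{(0)}]=1/(n+1)$ for Uniform$(0,1)$ samples, the averages shrink roughly like $1/n$:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 2_000

# For each n, average the minimum of n Uniform(0,1) draws over many
# trials; the averages should track E[min] = 1/(n+1) and shrink to 0.
for n in (10, 100, 1_000, 10_000):
    mins = rng.uniform(size=(trials, n)).min(axis=1)
    print(f"n = {n:6d}   mean of minima = {mins.mean():.6f}")
```

After the loop, `mins` holds the minima for $n=10{,}000$, whose mean is close to $1/10001\approx 10^{-4}$.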