On pp. 370-374 of Cramér's 1946 Mathematical Methods of Statistics (see this previous question), the author shows that for any continuous distribution $P$, if $X_1, \dots, X_n \sim P$ are i.i.d., then the RV:
$$ \Xi_n \overset{def}{=} n (1 - F(M_n)) \,, \quad \text{where }M_n \overset{def}{=} \max_{1 \le i \le n} X_i \,, $$
has the same distribution as the minimum of $n$ i.i.d. RV's $U_1, \dots, U_n$, where $U_i \sim \operatorname{Uniform}(0,n)$, inasmuch as he shows that they have the same density (cf. p. 49 of Keener, Theoretical Statistics). (Here $F$ of course denotes the CDF of each of the observations $X_i$; note that $1 - F(M_n) = \min_{1 \le i \le n} (1 - F(X_i))$, a minimum of $n$ i.i.d. $\operatorname{Uniform}(0,1)$ variables.)
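This distributional identity is easy to sanity-check by simulation. Below is a minimal Monte Carlo sketch (Python with NumPy; the standard normal assumption and the sample sizes are my own choices, not from Cramér), comparing $\Xi_n = n(1 - F(M_n))$ for normal observations against the minimum of $n$ i.i.d. $\operatorname{Uniform}(0,n)$ variables; both should have mean $n/(n+1) \approx 1$:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n, reps = 100, 5000

# Standard normal CDF via the error function
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Xi_n = n * (1 - F(M_n)) for i.i.d. N(0,1) samples
M = rng.standard_normal((reps, n)).max(axis=1)
xi_normal = np.array([n * (1.0 - Phi(m)) for m in M])

# Comparison: minimum of n i.i.d. Uniform(0, n) variables
xi_unif = rng.uniform(0.0, n, size=(reps, n)).min(axis=1)

# Both samples should have mean n/(n+1) ~ 1 (and be roughly Exp(1) for large n)
print(xi_normal.mean(), xi_unif.mean())
```

The agreement of the two empirical means (and, more generally, of the empirical distributions) illustrates the identity; it is of course not a proof.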
Cramér later says that we can assume $\Xi_n$ "is bounded", but in what sense precisely is that true?
Question: What is the almost sure rate of growth of the random variables $\Xi_n$? Of $\log(\Xi_n)$?
In particular, is it true that $\log(\Xi_n) = o(\log\log n)$ almost surely?
$\log(\Xi_n) = o(\log\log n)$ a.s. appears to be sufficient to show that $\max_{1 \le i \le n} X_i \sim \sqrt{2 \log n}$ a.s. where the $X_i \sim \mathscr{N}(0,1)$, using the argument found here, hence my interest.
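The normal-maximum asymptotic can be eyeballed numerically. A rough sketch (Python with NumPy; the sample size and number of runs are arbitrary, and a finite simulation of course says nothing about a.s. behavior):

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10**5, 50

# Ratio of the max of n i.i.d. standard normals to sqrt(2 log n), averaged
# over independent runs.  Convergence is slow (the correction term is of
# order log log n / sqrt(log n)), so the ratio sits somewhat below 1.
M = rng.standard_normal((reps, n)).max(axis=1)
ratio = (M / math.sqrt(2.0 * math.log(n))).mean()
print(ratio)
```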
In a community wiki "answer" below I have posted what I believe to be a proof that $\Xi_n = o(\log n)$ a.s., at least when the $X_i \sim \mathscr{N}(0,1)$, an assumption that any answer is welcome to make. Note that it appears to be sufficient to show that $\Xi_n = o((\log n)^{\varepsilon})$ a.s. for all $\varepsilon > 0$ in order to conclude $\log (\Xi_n) = o(\log \log n)$ a.s.
Note: For each $n$, $\Xi_n$ has the distribution of the minimum of $n$ i.i.d. uniforms supported on $[0,n]$ (not $[0,1]$ or $[0, \theta]$ for some fixed constant), so this problem does not seem completely trivial.
Note 2: Following the argument below, it seems it suffices to show (assuming this is even true) that for any $\epsilon > 0$ and $\delta > 0$:
$$ \sum_{n=1}^{\infty} \frac{(\log n)^\delta}{n (\exp ((\log n)^\delta ))^\epsilon } < \infty \,.$$
I initially could not see how to control (i.e. bound) the $\exp((\log n)^\delta)$ factor usefully. One observation: since $e^x \geq x^m/m!$ for $x \geq 0$ and every integer $m \geq 1$, the factor $\exp(\epsilon(\log n)^\delta)$ eventually dominates any fixed power of $\log n$. Choosing $m$ with $m\delta > 1 + \delta$ bounds the general term by a constant times $\frac{1}{n(\log n)^{m\delta - \delta}}$, and the Bertrand series $\sum_n \frac{1}{n (\log n)^s}$ converges for $s > 1$; so the series above converges for every $\epsilon, \delta > 0$.
Note 3: $\Xi_n$ is in fact stochastically dominated by an exponential random variable with parameter $1$ for every $n$, since $\mathbb{P}(\Xi_n > t) = (1 - t/n)^n \leq e^{-t}$ for $0 \leq t \leq n$; so maybe it's possible to get the necessary asymptotic probabilistic bound more easily for exponentials and then use the stochastic dominance to conclude.
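The dominance claim reduces to the elementary bound $1 - s \leq e^{-s}$. A small deterministic grid check (Python; the grid of $(n, t)$ values is arbitrary):

```python
import math

# P(Xi_n > t) = (1 - t/n)^n for 0 <= t <= n, while P(Exp(1) > t) = e^{-t}.
# The bound 1 - s <= e^{-s} gives (1 - t/n)^n <= e^{-t}, i.e. Xi_n is
# stochastically dominated by a standard exponential for every n.
dominated = True
for n in (1, 2, 5, 10, 100, 1000):
    for j in range(201):
        t = j * n / 200.0          # grid of t values covering [0, n]
        if (1.0 - t / n) ** n > math.exp(-t) + 1e-12:
            dominated = False
print(dominated)
```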
By the setting, $V_n = 1 - F(X_n)$ defines a sequence of i.i.d. random variables with $V_n \sim \operatorname{Uniform}([0,1])$. Write
$$ \xi_n = \min\{V_1, \cdots, V_n\}, \qquad \Xi_n = n \xi_n.$$
Here we prove the following claim.

Claim. $\limsup\limits_{n\to\infty} \dfrac{\Xi_n}{\log\log n} = 1$ almost surely.

We prove the two inequalities separately.
Proof of $\ ``\leq\text{''}$. We first fix $\epsilon \in (0, 1)$ and $\alpha > 1$, and set $n_k = \lfloor \alpha^k \rfloor$. Then from the inequality $ \mathbb{P}( \Xi_n > t) = \left( 1 - \frac{t}{n} \right)^n \leq e^{-t} $, it follows that
$$ \sum_{k=1}^{\infty} \mathbb{P}(\Xi_{n_k} > (1+\epsilon)\log\log n_k) \leq \sum_{k=1}^{\infty} \frac{1}{\left( \log n_k \right)^{1+\epsilon}} < \infty $$
since $(\log n_k)^{-1-\epsilon} \asymp k^{-1-\epsilon}$ as $k\to\infty$. By the Borel–Cantelli lemma, a.s. $\Xi_{n_k} \leq (1+\epsilon)\log\log n_k$ holds eventually. But for each $n \in [n_k, n_{k+1}]$, using that $\xi_n \leq \xi_{n_k}$, we have
$$ \frac{\Xi_{n}}{\log\log n} = \frac{n \xi_n}{\log\log n} \leq \frac{n_{k+1} \xi_{n_k}}{\log\log n_k} = \frac{n_{k+1}}{n_k} \cdot \frac{\Xi_{n_k}}{\log\log n_k}, $$
and so, taking limsup as $k\to\infty$ shows that
$$ \limsup_{n\to\infty} \frac{\Xi_n}{\log\log n} \leq \alpha(1+\epsilon) \quad \text{a.s.} $$
But since this is true for any $\epsilon > 0$ and $\alpha > 1$, letting $\epsilon \to 0$ and $\alpha \to 1$ along a countable sequence (so that we only deal with a countable intersection of a.s. events) proves the desired inequality.
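(Not part of the proof: a single-sample-path illustration of the quantity being bounded, in Python with NumPy; the path length and cutoff are arbitrary, and one path proves nothing about a.s. behavior.)

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10**6

# Running minimum xi_n of i.i.d. Uniform(0,1) samples, and Xi_n = n * xi_n.
V = rng.uniform(size=N)
xi = np.minimum.accumulate(V)
n = np.arange(1, N + 1)
Xi = n * xi

# Track Xi_n / log log n for n >= 10 (so that log log n > 0); on typical
# runs the largest observed ratio is of order one.
mask = n >= 10
worst = (Xi[mask] / np.log(np.log(n[mask]))).max()
print(worst)
```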
Proof of $\ ``\geq\text{''}$. This part is much harder. Define $X_n$ and $T_n$ as follows:
$T_0 = 0$, and $T_{n} = \inf \{ m > T_{n-1} : \xi_m < \xi_{T_{n-1}} \}$ for $n \geq 1$. (Here, the convention $\xi_0 = 1$ is used.) In other words, $T_n$ is the $n$-th time at which the running minimum $\xi_m$ strictly decreases.
$X_0 = 1$, and $X_{n} = \xi_{T_{n}} / \xi_{T_{n-1}}$.
Then $(X_n)_{n=1}^{\infty}$ are i.i.d. with $X_n \sim \operatorname{Uniform}([0,1])$, and, conditionally on $(X_k)_{k \geq 0}$, the increment $T_{n} - T_{n-1} \sim \operatorname{Geometric}(X_0 \cdots X_{n-1})$ for $n \geq 1$. Now we fix $\epsilon \in (0, 1)$ and focus on the extreme behavior of
$$\tau_n := X_1 \cdots X_{n-1} (T_n - T_{n-1}).$$
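(A quick numerical sanity check, not part of the proof: the record ratios should be i.i.d. $\operatorname{Uniform}(0,1)$. The sketch below, in Python with NumPy and with an arbitrary number of runs, extracts $X_2 = \xi_{T_2}/\xi_{T_1}$ over many independent runs and checks its mean is near $1/2$.)

```python
import numpy as np

rng = np.random.default_rng(3)
reps = 3000

# For each run: xi_{T_1} = V_1 (since xi_0 = 1, the first record is at time 1);
# then draw uniforms until the first value below it, giving the second record
# and the ratio X_2 = xi_{T_2} / xi_{T_1}.
x2 = []
for _ in range(reps):
    current = rng.uniform()
    while True:
        v = rng.uniform()
        if v < current:          # second record time T_2 reached
            x2.append(v / current)
            break
x2 = np.array(x2)
print(x2.mean())
```

Given $\xi_{T_1} = u$, the first sample falling below $u$ is uniform on $(0, u)$, so the ratio is $\operatorname{Uniform}(0,1)$ independently of $u$, which is what the simulation reflects.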
To this end, write $\mathcal{F}_n = \sigma( X_0, \cdots, X_n, T_0, \cdots, T_n)$ for the natural filtration of $(X_n)$ and $(T_n)$. Then $\tau_n$ is $\mathcal{F}_n$-measurable, and
$$ \mathbb{P}(\tau_n > (1-\epsilon)\log n \mid \mathcal{F}_{n-1}) \geq \left( 1 - X_1\cdots X_{n-1} \right)^{\frac{(1-\epsilon)\log n}{X_1 \cdots X_{n-1}}} $$
By the strong law of large numbers, $X_1 \cdots X_n = e^{-(1+o(1))n}$ a.s. (since $\mathbb{E}[\log X_1] = -1$), and so this lower bound is $\asymp n^{-(1-\epsilon)}$ a.s. This shows that $\sum_{n=1}^{\infty} \mathbb{P}(\tau_n > (1-\epsilon)\log n \mid \mathcal{F}_{n-1}) = \infty$ a.s., and so, by the generalized second Borel–Cantelli lemma, it follows that
$$ \tau_n > (1-\epsilon)\log n\quad \text{infinitely often} \quad \text{a.s.} \tag{1} $$
On the other hand, for $n \geq 3$,
$$ \mathbb{P}(\tau_n > 2019\log n) \leq \mathbb{E} \left[ \left( 1 - X_1\cdots X_{n-1} \right)^{\frac{2019\log n}{X_1 \cdots X_{n-1}}-1} \right] \leq \frac{c}{n^{2019}} $$
where $c > 0$ is an absolute constant. (We can pick $c = \mathbb{E}[(1-X_1 X_2)^{-1}] = \pi^2/6$, for instance, although this value is not important.) So by the Borel–Cantelli lemma, a.s. $\tau_n \leq 2019 \log n$ holds eventually. Moreover, by the strong law of large numbers again, $\mathbb{P}(X_1 \cdots X_n \geq e^{-(1+\epsilon)n} \text{ eventually}) = 1$. Therefore
$$ T_n = \sum_{k=1}^{n} \frac{\tau_k}{X_1 \cdots X_{k-1}} \leq \sum_{k=1}^{n} 2019\, e^{(1+\epsilon)k} \log k + \mathcal{O}(1) \leq C e^{(1+\epsilon)n}\log n $$
for some random variable $C$ which is a.s. finite and positive. Using this inequality together with $\xi_{T_n - 1} = \xi_{T_{n-1}} = X_1 \cdots X_{n-1}$,
$$ \frac{\Xi_{T_n - 1}}{\log\log T_n} = \frac{X_1 \cdots X_{n-1} (T_n - 1)}{\log \log T_n} \geq \frac{\tau_n}{\log n + O(1)}, \tag{2} $$
since $T_n - 1 \geq T_n - T_{n-1}$ and $\log\log T_n \leq \log\log\left( C e^{(1+\epsilon)n} \log n \right) = \log n + O(1)$.
Combining $\text{(1)}$ and $\text{(2)}$, it follows that
$$ \limsup_{n\to\infty} \frac{\Xi_n}{\log\log n} \geq \limsup_{n\to\infty} \frac{\Xi_{T_n - 1}}{\log \log T_n} \geq 1 - \epsilon \quad \text{a.s.}, $$
and since this is true for any $\epsilon \in (0, 1)$, the desired inequality follows. $\square$
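As a side check, the constant $c = \mathbb{E}[(1-X_1 X_2)^{-1}] = \pi^2/6$ used above follows from $\mathbb{E}[(UV)^k] = 1/(k+1)^2$ for independent $\operatorname{Uniform}(0,1)$ variables $U, V$, by expanding $(1-UV)^{-1}$ as a geometric series. A quick numerical verification of the resulting identity $\sum_{k \geq 0} (k+1)^{-2} = \pi^2/6$ (Python; truncation level arbitrary):

```python
import math

# E[(UV)^k] = 1/(k+1)^2, so E[1/(1 - UV)] = sum_{k>=0} 1/(k+1)^2 = pi^2/6.
# Partial sum of the series; the tail beyond N terms is below 1/N.
N = 10**6
partial = sum(1.0 / (k + 1) ** 2 for k in range(N))
print(partial, math.pi**2 / 6)
```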