This is the same question I asked here a few days ago, but I have now really worked on it and still could not find the solution, so I need some more help.
Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with continuous distribution function $F(x) = 1-\exp(-x)$, $x > 0$. Furthermore, define $N(1) := 1$ and for $n \ge 2$, \begin{align*} N(n) := \min\{j : j > N(n-1), X_j > X_{N(n-1)}\}. \end{align*} Then, \begin{align*} \limsup_{n \to \infty} \frac{|\log N(n) - X_{N(n)}|}{\log n} = 1 \end{align*} almost surely.
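To get a feel for the statement, I also ran a small simulation (my own sketch, not part of any proof): it extracts the record times $N(n)$ and record values $X_{N(n)}$ from one Exp(1) sample and evaluates $|\log N(n) - X_{N(n)}|/\log n$. A sample of size $m$ only yields about $\log m$ records, so this cannot exhibit the $\limsup$; it only shows that $\log N(n)$ and $X_{N(n)}$ track each other.

```python
import numpy as np

# Numerical sketch (not a proof): extract record times N(n) and record
# values X_{N(n)} from one Exp(1) sample and look at the ratio
# |log N(n) - X_{N(n)}| / log n from the problem statement.
rng = np.random.default_rng(0)
X = rng.exponential(size=500_000)

record_idx = [0]                       # N(1) = 1, i.e. index 0
for j in range(1, len(X)):
    if X[j] > X[record_idx[-1]]:       # X_j exceeds the current record
        record_idx.append(j)

N = np.array(record_idx) + 1           # record times N(n), 1-based
vals = X[np.array(record_idx)]         # record values X_{N(n)}

n = np.arange(1, len(N) + 1)
ratio = np.abs(np.log(N[1:]) - vals[1:]) / np.log(n[1:])  # skip n = 1
print("number of records:", len(N))
print("max ratio observed:", ratio.max())
```
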
The textbook by Galambos gives the following proof strategy, which I have tried to understand:
1. The random variables $N(n)$ are conditionally independent, given the sequence $X_{N(k)}$, $k \ge 1$.
2. The conditional distribution of $N(n)$ is \begin{align*} P(N(n) > t \mid X_{N(k)}, k \ge 1) = (1 - \exp(- X_{N(n)}))^t. \end{align*}
3. From this fact, one gets at once by a Borel-Cantelli type of argument that \begin{align*} \limsup_{n \to \infty} \frac{|\log(N(n) \exp(-X_{N(n)}))|}{\log n} = 1 \end{align*} almost surely, given the sequence $\{X_{N(k)}, k \ge 1\}$.
4. But, since the $\limsup$ is a constant, the same result is obtained if the condition is dropped.
What I have so far is 2.: \begin{align*} P(N(n) > t \mid X_{N(k)}, k \ge 1) &= P(\max\{X_1, \ldots, X_t\} < X_{N(n)}\mid X_{N(k)}, k \ge 1) \\ &= P(Z_t < X_{N(n)} \mid X_{N(k)}, k \ge 1) = (F(X_{N(n)}))^t \\ &= (1 - \exp(- X_{N(n)}))^t, \end{align*} where $Z_t := \max\{X_1, \ldots, X_t\}$.
Then, for 1. I need to show that \begin{align*} &P(N(n_1) \le t_1, N(n_2) \le t_2 \mid X_{N(k)}, k \ge 1) \\ &= P(N(n_1) \le t_1 \mid X_{N(k)}, k \ge 1) \cdot P(N(n_2) \le t_2 \mid X_{N(k)}, k \ge 1). \end{align*} One has \begin{align*} &P(N(n_1) \le t_1 \mid X_{N(k)}, k \ge 1) \cdot P(N(n_2) \le t_2 \mid X_{N(k)}, k \ge 1) \\ &= P(Z_{t_1} \ge X_{N(n_1)}\mid X_{N(k)}, k \ge 1) \cdot P(Z_{t_2} \ge X_{N(n_2)}\mid X_{N(k)}, k \ge 1) \\ &= (1 - P(Z_{t_1} < X_{N(n_1)}\mid X_{N(k)}, k \ge 1)) (1 - P(Z_{t_2} < X_{N(n_2)}\mid X_{N(k)}, k \ge 1)) \\ &= (1 - (F(X_{N(n_1)}))^{t_1}) (1 - (F(X_{N(n_2)}))^{t_2}) \\ &= (1 - (1 - \exp(- X_{N(n_1)}))^{t_1}) (1 - (1 - \exp(- X_{N(n_2)}))^{t_2}) \\ &= 1 - (1 - \exp(- X_{N(n_2)}))^{t_2} - (1 - \exp(- X_{N(n_1)}))^{t_1} \\ &\quad + (1 - \exp(- X_{N(n_1)}))^{t_1}(1 - \exp(- X_{N(n_2)}))^{t_2} \end{align*} I do not see how I can get the other end of the equation.
Then, for 3., I wanted to use a fact that follows from the Borel-Cantelli lemma:
Let $X$ be a random variable and $(X_n)$ a sequence of random variables on some probability space $(\Omega, \mathcal F, P)$. If $\sum_{n \ge 1} P(|X_n - X| > \epsilon) < \infty$ for every $\epsilon > 0$, then $X_n \to X$ almost surely.
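A toy numerical illustration of this corollary (my own example, with $X = 0$ and $X_n \sim \mathrm{Bernoulli}(n^{-2})$ independent, chosen only because $\sum_n n^{-2} < \infty$):

```python
import numpy as np

# Toy check of the Borel-Cantelli corollary with X = 0 and
# X_n ~ Bernoulli(1/n^2) independent (my own choice of example):
# for eps in (0,1), sum_n P(|X_n| > eps) = sum_n 1/n^2 < infinity,
# so X_n -> 0 a.s., i.e. almost every path has only finitely many 1s.
rng = np.random.default_rng(1)
paths, horizon = 2_000, 5_000
n = np.arange(1, horizon + 1)
hits = rng.random((paths, horizon)) < 1.0 / n**2  # hits[i, k]: did X_{k+1} = 1 on path i?

# Per path, count the 1s after index 100; by the union bound such a
# late 1 occurs with probability <= sum_{n > 100} 1/n^2, about 0.01.
late_hits = hits[:, 100:].sum(axis=1)
print("fraction of paths with no 1 after n = 100:", (late_hits == 0).mean())
```
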
Unless I am mistaken, we have \begin{align*} &\left\{\left|\frac{|\log N(n) - X_{N(n)}|}{\log n} - 1\right| > \epsilon\right\} = \left\{||\log N(n) - X_{N(n)}| - \log n| > \epsilon \log n\right\} \\ &= \left\{|\log N(n) - X_{N(n)}| > (1+\epsilon) \log n\right\} \cup \left\{|\log N(n) - X_{N(n)}| < (1 - \epsilon)\log n\right\} \\ &= \left\{N(n) > n^{1+\epsilon} \exp(X_{N(n)})\right\} \cup \left\{N(n) < n^{-(1+\epsilon)} \exp(X_{N(n)})\right\} \\ &\quad \cup \left\{n^{\epsilon - 1} \exp(X_{N(n)}) < N(n) < n^{1-\epsilon} \exp(X_{N(n)})\right\}, \end{align*} where the first two events come from the two signs of $\log N(n) - X_{N(n)}$ in $\{|\log N(n) - X_{N(n)}| > (1+\epsilon)\log n\}$, and the last event (two-sided, not a union of two one-sided events) is $\{|\log N(n) - X_{N(n)}| < (1-\epsilon)\log n\}$. So I wanted to split the series into three single series and show that each of them converges. But I can do this only with the first series (from which one can get a geometric series), not with the other ones. My feeling is that there is a much simpler solution for this task. 4. is clear to me.
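For completeness, here is the computation I had in mind for the first series, using the conditional distribution from 2. and the elementary bound $1 - u \le e^{-u}$ (ignoring that the exponent should really be an integer): \begin{align*} P\left(N(n) > n^{1+\epsilon} \exp(X_{N(n)}) \mid X_{N(k)}, k \ge 1\right) &= \left(1 - \exp(-X_{N(n)})\right)^{n^{1+\epsilon} \exp(X_{N(n)})} \\ &\le \exp\left(- \exp(-X_{N(n)}) \, n^{1+\epsilon} \exp(X_{N(n)})\right) = \exp(-n^{1+\epsilon}) \le e^{-n}, \end{align*} so the first series is dominated by a convergent geometric series.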
Thank you for any kind of help.