Let $(\xi_n)_n$ be an i.i.d. sequence of random variables with $\xi_0\sim \mathrm{Exp}(1)$. Let $T_0=0$ and $$ T_n:=\inf\{j> T_{n-1};\ \xi_j>\xi_{T_{n-1}}\}. $$
I'm interested in understanding the record process $Y_n=\xi_{T_n}$, in particular the distribution of the increments $X_n:=\xi_{T_n}-\xi_{T_{n-1}}$. In this question the OP links an image containing the following lemma:
Lemma 1.1. There exists a sequence $(\xi'_n)_n$ of i.i.d. r.v.'s with $\xi'_0\sim \mathrm{Exp}(1)$ such that the sequence of records $(Y_n)_n$ can be written as $$ Y_n=\sum_{i=1}^n\xi_i'. $$
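As a numerical sanity check on the lemma (not a reference, just a quick plain-Python sketch; `first_records` is a helper name I made up), the first two record increments should each have sample mean close to $1$, and $Y_2$ should have mean close to $3$:

```python
import random

random.seed(42)

def first_records(k):
    """Return the first k+1 record values Y_0, ..., Y_k of an
    i.i.d. Exp(1) stream, found by direct search."""
    y = [random.expovariate(1.0)]      # Y_0 = xi_0
    while len(y) < k + 1:
        x = random.expovariate(1.0)
        if x > y[-1]:                  # strictly larger => new record
            y.append(x)
    return y

trials = [first_records(2) for _ in range(2000)]
# If the lemma holds, X_1 and X_2 have mean ~1 and Y_2 has mean ~3.
m_x1 = sum(t[1] - t[0] for t in trials) / len(trials)
m_x2 = sum(t[2] - t[1] for t in trials) / len(trials)
m_y2 = sum(t[2] for t in trials) / len(trials)
print(round(m_x1, 2), round(m_x2, 2), round(m_y2, 2))
```

(Direct search gets slow for later records, since the expected waiting time for the next record is infinite; two increments is cheap enough.)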
The OP attributes the proof to Resnick, but doesn't say what the exact reference is and is not responsive in the comments. Could someone give me the reference to this book/paper?
In Chapter 4 of the book Extreme Values, Regular Variation, and Point Processes, Resnick seems to study only the case of hitting times $$ R_n:=\inf\{j> R_{n-1};\ \xi_j\in B\}, $$ where $B$ is a fixed set.
I would also be very interested in any other reference related to the process $Y_n$.
EDIT: In response to Misha Lavrov.
I first thought that memorylessness would give me $X_n\sim \mathrm{Exp}(1)$:
\begin{align*} \mathbb{P}(X_k\leq t)&=\mathbb{P}(\xi_2-\xi_1\leq t|\xi_2>\xi_1)=\int_0^\infty \frac{\mathbb{P}(\xi_2\leq t+s, \xi_2>s)}{\mathbb{P}(\xi_2>s)} e^{-s}ds\\& =\int_0^\infty \frac{e^{-s}(1-e^{-t})}{e^{-s}}e^{-s}ds=1-e^{-t}. \end{align*}
But I am not sure whether the first equality holds. My intuition is that the larger $Y_n$ gets, the harder it is to beat, and that when it is beaten, the margin should be smaller each time.
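The computation above can at least be checked for the first increment by rejection sampling the conditional law of $\xi_2-\xi_1$ given $\xi_2>\xi_1$ (a sketch; `conditional_gap` is my own helper name). If the conditional law is indeed $\mathrm{Exp}(1)$, the sample mean should be near $1$ and the empirical CDF at $t=1$ near $1-e^{-1}\approx 0.632$:

```python
import random

random.seed(1)

def conditional_gap():
    """Sample xi_2 - xi_1 conditioned on xi_2 > xi_1 by rejection:
    draw pairs of Exp(1) variables, accept when the second is larger."""
    while True:
        a = random.expovariate(1.0)
        b = random.expovariate(1.0)
        if b > a:
            return b - a

gaps = [conditional_gap() for _ in range(20000)]
mean = sum(gaps) / len(gaps)                    # should be near 1
cdf1 = sum(g <= 1.0 for g in gaps) / len(gaps)  # should be near 1 - e^{-1}
print(round(mean, 3), round(cdf1, 3))
```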